2025-09-23 18:28:55.360968 | Job console starting
2025-09-23 18:28:55.371161 | Updating git repos
2025-09-23 18:28:55.468030 | Cloning repos into workspace
2025-09-23 18:28:55.674288 | Restoring repo states
2025-09-23 18:28:55.705318 | Merging changes
2025-09-23 18:28:55.705340 | Checking out repos
2025-09-23 18:28:55.969932 | Preparing playbooks
2025-09-23 18:28:56.651283 | Running Ansible setup
2025-09-23 18:29:00.852519 | PRE-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/pre.yaml@main]
2025-09-23 18:29:01.603789 |
2025-09-23 18:29:01.603989 | PLAY [Base pre]
2025-09-23 18:29:01.620795 |
2025-09-23 18:29:01.620953 | TASK [Setup log path fact]
2025-09-23 18:29:01.651977 | orchestrator | ok
2025-09-23 18:29:01.669522 |
2025-09-23 18:29:01.669668 | TASK [set-zuul-log-path-fact : Set log path for a build]
2025-09-23 18:29:01.700023 | orchestrator | ok
2025-09-23 18:29:01.711747 |
2025-09-23 18:29:01.711902 | TASK [emit-job-header : Print job information]
2025-09-23 18:29:01.770106 | # Job Information
2025-09-23 18:29:01.770319 | Ansible Version: 2.16.14
2025-09-23 18:29:01.770355 | Job: testbed-deploy-in-a-nutshell-ubuntu-24.04
2025-09-23 18:29:01.770566 | Pipeline: post
2025-09-23 18:29:01.770620 | Executor: 521e9411259a
2025-09-23 18:29:01.770643 | Triggered by: https://github.com/osism/testbed/commit/ad77bb1e52fd3867b1aae14084041f16aa02d544
2025-09-23 18:29:01.770668 | Event ID: 24bef35c-98ab-11f0-817a-2b9b7b73a959
2025-09-23 18:29:01.780365 |
2025-09-23 18:29:01.780502 | LOOP [emit-job-header : Print node information]
2025-09-23 18:29:01.926365 | orchestrator | ok:
2025-09-23 18:29:01.926649 | orchestrator | # Node Information
2025-09-23 18:29:01.926709 | orchestrator | Inventory Hostname: orchestrator
2025-09-23 18:29:01.926754 | orchestrator | Hostname: zuul-static-regiocloud-infra-1
2025-09-23 18:29:01.926792 | orchestrator | Username: zuul-testbed02
2025-09-23 18:29:01.927020 | orchestrator | Distro: Debian 12.12
2025-09-23 18:29:01.927081 | orchestrator | Provider: static-testbed
2025-09-23 18:29:01.927123 | orchestrator | Region:
2025-09-23 18:29:01.927163 | orchestrator | Label: testbed-orchestrator
2025-09-23 18:29:01.927199 | orchestrator | Product Name: OpenStack Nova
2025-09-23 18:29:01.927234 | orchestrator | Interface IP: 81.163.193.140
2025-09-23 18:29:01.948659 |
2025-09-23 18:29:01.948870 | TASK [log-inventory : Ensure Zuul Ansible directory exists]
2025-09-23 18:29:02.421584 | orchestrator -> localhost | changed
2025-09-23 18:29:02.439384 |
2025-09-23 18:29:02.439553 | TASK [log-inventory : Copy ansible inventory to logs dir]
2025-09-23 18:29:03.476847 | orchestrator -> localhost | changed
2025-09-23 18:29:03.493248 |
2025-09-23 18:29:03.493375 | TASK [add-build-sshkey : Check to see if ssh key was already created for this build]
2025-09-23 18:29:03.782401 | orchestrator -> localhost | ok
2025-09-23 18:29:03.790077 |
2025-09-23 18:29:03.790202 | TASK [add-build-sshkey : Create a new key in workspace based on build UUID]
2025-09-23 18:29:03.821965 | orchestrator | ok
2025-09-23 18:29:03.839180 | orchestrator | included: /var/lib/zuul/builds/05e476f00ce44eecbe8a69c52322685d/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/create-key-and-replace.yaml
2025-09-23 18:29:03.848029 |
2025-09-23 18:29:03.848130 | TASK [add-build-sshkey : Create Temp SSH key]
2025-09-23 18:29:05.481884 | orchestrator -> localhost | Generating public/private rsa key pair.
2025-09-23 18:29:05.482380 | orchestrator -> localhost | Your identification has been saved in /var/lib/zuul/builds/05e476f00ce44eecbe8a69c52322685d/work/05e476f00ce44eecbe8a69c52322685d_id_rsa
2025-09-23 18:29:05.482470 | orchestrator -> localhost | Your public key has been saved in /var/lib/zuul/builds/05e476f00ce44eecbe8a69c52322685d/work/05e476f00ce44eecbe8a69c52322685d_id_rsa.pub
2025-09-23 18:29:05.482533 | orchestrator -> localhost | The key fingerprint is:
2025-09-23 18:29:05.482590 | orchestrator -> localhost | SHA256:xWpNJilYtlF5d8C0kXLLd0ZL3QoTLvUwH4gwg2xfTEU zuul-build-sshkey
2025-09-23 18:29:05.482642 | orchestrator -> localhost | The key's randomart image is:
2025-09-23 18:29:05.482709 | orchestrator -> localhost | +---[RSA 3072]----+
2025-09-23 18:29:05.482762 | orchestrator -> localhost | | .+o==.*E=. o|
2025-09-23 18:29:05.482812 | orchestrator -> localhost | | ++o.=*oOB.o+|
2025-09-23 18:29:05.482916 | orchestrator -> localhost | | ..o.oo*=o=+o.|
2025-09-23 18:29:05.482964 | orchestrator -> localhost | | ..B .o o.o|
2025-09-23 18:29:05.483011 | orchestrator -> localhost | | S . . o |
2025-09-23 18:29:05.483065 | orchestrator -> localhost | | . |
2025-09-23 18:29:05.483113 | orchestrator -> localhost | | |
2025-09-23 18:29:05.483159 | orchestrator -> localhost | | |
2025-09-23 18:29:05.483207 | orchestrator -> localhost | | |
2025-09-23 18:29:05.483256 | orchestrator -> localhost | +----[SHA256]-----+
2025-09-23 18:29:05.483370 | orchestrator -> localhost | ok: Runtime: 0:00:01.120243
2025-09-23 18:29:05.500249 |
2025-09-23 18:29:05.500382 | TASK [add-build-sshkey : Remote setup ssh keys (linux)]
2025-09-23 18:29:05.533976 | orchestrator | ok
2025-09-23 18:29:05.545811 | orchestrator | included: /var/lib/zuul/builds/05e476f00ce44eecbe8a69c52322685d/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/remote-linux.yaml
2025-09-23 18:29:05.555525 |
2025-09-23 18:29:05.555626 | TASK [add-build-sshkey : Remove previously added zuul-build-sshkey]
2025-09-23 18:29:05.583102 | orchestrator | skipping: Conditional result was False
2025-09-23 18:29:05.599997 |
2025-09-23 18:29:05.600143 | TASK [add-build-sshkey : Enable access via build key on all nodes]
2025-09-23 18:29:06.192182 | orchestrator | changed
2025-09-23 18:29:06.203006 |
2025-09-23 18:29:06.203140 | TASK [add-build-sshkey : Make sure user has a .ssh]
2025-09-23 18:29:06.486947 | orchestrator | ok
2025-09-23 18:29:06.497398 |
2025-09-23 18:29:06.497546 | TASK [add-build-sshkey : Install build private key as SSH key on all nodes]
2025-09-23 18:29:06.923022 | orchestrator | ok
2025-09-23 18:29:06.931462 |
2025-09-23 18:29:06.931590 | TASK [add-build-sshkey : Install build public key as SSH key on all nodes]
2025-09-23 18:29:07.353157 | orchestrator | ok
2025-09-23 18:29:07.361589 |
2025-09-23 18:29:07.361718 | TASK [add-build-sshkey : Remote setup ssh keys (windows)]
2025-09-23 18:29:07.389740 | orchestrator | skipping: Conditional result was False
2025-09-23 18:29:07.403453 |
2025-09-23 18:29:07.403598 | TASK [remove-zuul-sshkey : Remove master key from local agent]
2025-09-23 18:29:07.858648 | orchestrator -> localhost | changed
2025-09-23 18:29:07.874957 |
2025-09-23 18:29:07.875075 | TASK [add-build-sshkey : Add back temp key]
2025-09-23 18:29:08.228136 | orchestrator -> localhost | Identity added: /var/lib/zuul/builds/05e476f00ce44eecbe8a69c52322685d/work/05e476f00ce44eecbe8a69c52322685d_id_rsa (zuul-build-sshkey)
2025-09-23 18:29:08.228698 | orchestrator -> localhost | ok: Runtime: 0:00:00.019305
2025-09-23 18:29:08.239578 |
2025-09-23 18:29:08.239689 | TASK [add-build-sshkey : Verify we can still SSH to all nodes]
2025-09-23 18:29:08.673669 | orchestrator | ok
2025-09-23 18:29:08.682811 |
2025-09-23 18:29:08.682984 | TASK [add-build-sshkey : Verify we can still SSH to all nodes (windows)]
2025-09-23 18:29:08.718793 | orchestrator | skipping: Conditional result was False
2025-09-23 18:29:08.786327 |
2025-09-23 18:29:08.786465 | TASK [start-zuul-console : Start zuul_console daemon.]
2025-09-23 18:29:09.186343 | orchestrator | ok
2025-09-23 18:29:09.204750 |
2025-09-23 18:29:09.204940 | TASK [validate-host : Define zuul_info_dir fact]
2025-09-23 18:29:09.251836 | orchestrator | ok
2025-09-23 18:29:09.261912 |
2025-09-23 18:29:09.262039 | TASK [validate-host : Ensure Zuul Ansible directory exists]
2025-09-23 18:29:09.575296 | orchestrator -> localhost | ok
2025-09-23 18:29:09.583904 |
2025-09-23 18:29:09.584028 | TASK [validate-host : Collect information about the host]
2025-09-23 18:29:10.784634 | orchestrator | ok
2025-09-23 18:29:10.801582 |
2025-09-23 18:29:10.801689 | TASK [validate-host : Sanitize hostname]
2025-09-23 18:29:10.867247 | orchestrator | ok
2025-09-23 18:29:10.875583 |
2025-09-23 18:29:10.875711 | TASK [validate-host : Write out all ansible variables/facts known for each host]
2025-09-23 18:29:11.428970 | orchestrator -> localhost | changed
2025-09-23 18:29:11.441844 |
2025-09-23 18:29:11.441997 | TASK [validate-host : Collect information about zuul worker]
2025-09-23 18:29:11.861454 | orchestrator | ok
2025-09-23 18:29:11.869025 |
2025-09-23 18:29:11.869155 | TASK [validate-host : Write out all zuul information for each host]
2025-09-23 18:29:12.408760 | orchestrator -> localhost | changed
2025-09-23 18:29:12.419966 |
2025-09-23 18:29:12.420073 | TASK [prepare-workspace-log : Start zuul_console daemon.]
2025-09-23 18:29:12.729509 | orchestrator | ok
2025-09-23 18:29:12.739880 |
2025-09-23 18:29:12.740012 | TASK [prepare-workspace-log : Synchronize src repos to workspace directory.]
2025-09-23 18:29:51.729367 | orchestrator | changed:
2025-09-23 18:29:51.729621 | orchestrator | .d..t...... src/
2025-09-23 18:29:51.729664 | orchestrator | .d..t...... src/github.com/
2025-09-23 18:29:51.729694 | orchestrator | .d..t...... src/github.com/osism/
2025-09-23 18:29:51.729722 | orchestrator | .d..t...... src/github.com/osism/ansible-collection-commons/
2025-09-23 18:29:51.729748 | orchestrator | RedHat.yml
2025-09-23 18:29:51.743945 | orchestrator | .L..t...... src/github.com/osism/ansible-collection-commons/roles/repository/tasks/CentOS.yml -> RedHat.yml
2025-09-23 18:29:51.743963 | orchestrator | RedHat.yml
2025-09-23 18:29:51.744016 | orchestrator | = 2.2.0"...
2025-09-23 18:30:03.209661 | orchestrator | 18:30:03.209 STDOUT terraform: - Finding latest version of hashicorp/null...
2025-09-23 18:30:03.237410 | orchestrator | 18:30:03.237 STDOUT terraform: - Finding terraform-provider-openstack/openstack versions matching ">= 1.53.0"...
2025-09-23 18:30:03.933759 | orchestrator | 18:30:03.933 STDOUT terraform: - Installing hashicorp/local v2.5.3...
2025-09-23 18:30:04.560878 | orchestrator | 18:30:04.560 STDOUT terraform: - Installed hashicorp/local v2.5.3 (signed, key ID 0C0AF313E5FD9F80)
2025-09-23 18:30:04.636566 | orchestrator | 18:30:04.636 STDOUT terraform: - Installing hashicorp/null v3.2.4...
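The provider resolution above corresponds to a `required_providers` block roughly like the following sketch. The `>= 1.53.0` constraint for the openstack provider is visible in the log; the attribution of the truncated `>= 2.2.0` constraint to `hashicorp/local` is an assumption (the line carrying its provider name was lost), and the actual constraints in the testbed repository may differ.

```hcl
terraform {
  required_providers {
    local = {
      source  = "hashicorp/local"
      version = ">= 2.2.0" # assumed owner of the truncated constraint; resolved to v2.5.3
    }
    null = {
      source = "hashicorp/null" # no constraint in the log ("Finding latest version"); resolved to v3.2.4
    }
    openstack = {
      source  = "terraform-provider-openstack/openstack"
      version = ">= 1.53.0" # constraint shown in the log; resolved to v3.3.2
    }
  }
}
```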
2025-09-23 18:30:05.105663 | orchestrator | 18:30:05.105 STDOUT terraform: - Installed hashicorp/null v3.2.4 (signed, key ID 0C0AF313E5FD9F80)
2025-09-23 18:30:05.182264 | orchestrator | 18:30:05.182 STDOUT terraform: - Installing terraform-provider-openstack/openstack v3.3.2...
2025-09-23 18:30:05.910293 | orchestrator | 18:30:05.910 STDOUT terraform: - Installed terraform-provider-openstack/openstack v3.3.2 (signed, key ID 4F80527A391BEFD2)
2025-09-23 18:30:05.910332 | orchestrator | 18:30:05.910 STDOUT terraform: Providers are signed by their developers.
2025-09-23 18:30:05.910394 | orchestrator | 18:30:05.910 STDOUT terraform: If you'd like to know more about provider signing, you can read about it here:
2025-09-23 18:30:05.910423 | orchestrator | 18:30:05.910 STDOUT terraform: https://opentofu.org/docs/cli/plugins/signing/
2025-09-23 18:30:05.911352 | orchestrator | 18:30:05.910 STDOUT terraform: OpenTofu has created a lock file .terraform.lock.hcl to record the provider
2025-09-23 18:30:05.911360 | orchestrator | 18:30:05.910 STDOUT terraform: selections it made above. Include this file in your version control repository
2025-09-23 18:30:05.911366 | orchestrator | 18:30:05.910 STDOUT terraform: so that OpenTofu can guarantee to make the same selections by default when
2025-09-23 18:30:05.911371 | orchestrator | 18:30:05.910 STDOUT terraform: you run "tofu init" in the future.
2025-09-23 18:30:05.911375 | orchestrator | 18:30:05.910 STDOUT terraform: OpenTofu has been successfully initialized!
2025-09-23 18:30:05.911379 | orchestrator | 18:30:05.911 STDOUT terraform: You may now begin working with OpenTofu. Try running "tofu plan" to see
2025-09-23 18:30:05.911383 | orchestrator | 18:30:05.911 STDOUT terraform: any changes that are required for your infrastructure. All OpenTofu commands
2025-09-23 18:30:05.911387 | orchestrator | 18:30:05.911 STDOUT terraform: should now work.
2025-09-23 18:30:05.911391 | orchestrator | 18:30:05.911 STDOUT terraform: If you ever set or change modules or backend configuration for OpenTofu,
2025-09-23 18:30:05.911395 | orchestrator | 18:30:05.911 STDOUT terraform: rerun this command to reinitialize your working directory. If you forget, other
2025-09-23 18:30:05.911400 | orchestrator | 18:30:05.911 STDOUT terraform: commands will detect it and remind you to do so if necessary.
2025-09-23 18:30:06.029088 | orchestrator | 18:30:06.028 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed02/terraform` instead.
2025-09-23 18:30:06.029130 | orchestrator | 18:30:06.029 WARN  The `workspace` command is deprecated and will be removed in a future version of Terragrunt. Use `terragrunt run -- workspace` instead.
2025-09-23 18:30:06.191204 | orchestrator | 18:30:06.190 STDOUT terraform: Created and switched to workspace "ci"!
2025-09-23 18:30:06.191281 | orchestrator | 18:30:06.190 STDOUT terraform: You're now on a new, empty workspace. Workspaces isolate their state,
2025-09-23 18:30:06.191291 | orchestrator | 18:30:06.190 STDOUT terraform: so if you run "tofu plan" OpenTofu will not see any existing state
2025-09-23 18:30:06.191298 | orchestrator | 18:30:06.191 STDOUT terraform: for this configuration.
2025-09-23 18:30:06.323073 | orchestrator | 18:30:06.322 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed02/terraform` instead.
2025-09-23 18:30:06.323120 | orchestrator | 18:30:06.323 WARN  The `fmt` command is deprecated and will be removed in a future version of Terragrunt. Use `terragrunt run -- fmt` instead.
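The Terragrunt deprecation warnings above each name their replacement. A minimal sketch of the non-deprecated invocations follows; the path is the one printed in this log, while the terragrunt commands are illustrative (and commented out here, since they need the job's working directory and a terragrunt binary):

```shell
# Replaces the deprecated TERRAGRUNT_TFPATH environment variable.
export TG_TF_PATH=/home/zuul-testbed02/terraform

# Replaces the deprecated `terragrunt workspace ...` subcommand (illustrative):
# terragrunt run -- workspace new ci

# Replaces the deprecated `terragrunt fmt` subcommand (illustrative):
# terragrunt run -- fmt

echo "TG_TF_PATH=$TG_TF_PATH"
```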
2025-09-23 18:30:06.443337 | orchestrator | 18:30:06.442 STDOUT terraform: ci.auto.tfvars
2025-09-23 18:30:06.448343 | orchestrator | 18:30:06.447 STDOUT terraform: default_custom.tf
2025-09-23 18:30:06.550851 | orchestrator | 18:30:06.550 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed02/terraform` instead.
2025-09-23 18:30:07.361281 | orchestrator | 18:30:07.360 STDOUT terraform: data.openstack_networking_network_v2.public: Reading...
2025-09-23 18:30:07.931561 | orchestrator | 18:30:07.931 STDOUT terraform: data.openstack_networking_network_v2.public: Read complete after 1s [id=e6be7364-bfd8-4de7-8120-8f41c69a139a]
2025-09-23 18:30:08.989980 | orchestrator | 18:30:08.989 STDOUT terraform: OpenTofu used the selected providers to generate the following execution
2025-09-23 18:30:08.990166 | orchestrator | 18:30:08.989 STDOUT terraform: plan. Resource actions are indicated with the following symbols:
2025-09-23 18:30:08.990210 | orchestrator | 18:30:08.989 STDOUT terraform:   + create
2025-09-23 18:30:08.990264 | orchestrator | 18:30:08.989 STDOUT terraform:  <= read (data resources)
2025-09-23 18:30:08.990294 | orchestrator | 18:30:08.990 STDOUT terraform: OpenTofu will perform the following actions:
2025-09-23 18:30:08.990301 | orchestrator | 18:30:08.990 STDOUT terraform:   # data.openstack_images_image_v2.image will be read during apply
2025-09-23 18:30:08.990365 | orchestrator | 18:30:08.990 STDOUT terraform:   # (config refers to values not yet known)
2025-09-23 18:30:08.990428 | orchestrator | 18:30:08.990 STDOUT terraform:  <= data "openstack_images_image_v2" "image" {
2025-09-23 18:30:08.990435 | orchestrator | 18:30:08.990 STDOUT terraform:       + checksum    = (known after apply)
2025-09-23 18:30:08.990440 | orchestrator | 18:30:08.990 STDOUT terraform:       + created_at  = (known after apply)
2025-09-23 18:30:08.990450 | orchestrator | 18:30:08.990 STDOUT terraform:       + file        = (known after apply)
2025-09-23 18:30:08.990465 | orchestrator | 18:30:08.990 STDOUT terraform:       + id          = (known after apply)
2025-09-23 18:30:08.990514 | orchestrator | 18:30:08.990 STDOUT terraform:       + metadata    = (known after apply)
2025-09-23 18:30:08.990550 | orchestrator | 18:30:08.990 STDOUT terraform:       + min_disk_gb = (known after apply)
2025-09-23 18:30:08.990563 | orchestrator | 18:30:08.990 STDOUT terraform:       + min_ram_mb  = (known after apply)
2025-09-23 18:30:08.990618 | orchestrator | 18:30:08.990 STDOUT terraform:       + most_recent = true
2025-09-23 18:30:08.990647 | orchestrator | 18:30:08.990 STDOUT terraform:       + name        = (known after apply)
2025-09-23 18:30:08.990696 | orchestrator | 18:30:08.990 STDOUT terraform:       + protected   = (known after apply)
2025-09-23 18:30:08.990780 | orchestrator | 18:30:08.990 STDOUT terraform:       + region      = (known after apply)
2025-09-23 18:30:08.990787 | orchestrator | 18:30:08.990 STDOUT terraform:       + schema      = (known after apply)
2025-09-23 18:30:08.990825 | orchestrator | 18:30:08.990 STDOUT terraform:       + size_bytes  = (known after apply)
2025-09-23 18:30:08.990937 | orchestrator | 18:30:08.990 STDOUT terraform:       + tags        = (known after apply)
2025-09-23 18:30:08.990942 | orchestrator | 18:30:08.990 STDOUT terraform:       + updated_at  = (known after apply)
2025-09-23 18:30:08.990974 | orchestrator | 18:30:08.990 STDOUT terraform:     }
2025-09-23 18:30:08.991052 | orchestrator | 18:30:08.990 STDOUT terraform:   # data.openstack_images_image_v2.image_node will be read during apply
2025-09-23 18:30:08.991096 | orchestrator | 18:30:08.991 STDOUT terraform:   # (config refers to values not yet known)
2025-09-23 18:30:08.991114 | orchestrator | 18:30:08.991 STDOUT terraform:  <= data "openstack_images_image_v2" "image_node" {
2025-09-23 18:30:08.991144 | orchestrator | 18:30:08.991 STDOUT terraform:       + checksum    = (known after apply)
2025-09-23 18:30:08.991198 | orchestrator | 18:30:08.991 STDOUT terraform:       + created_at  = (known after apply)
2025-09-23 18:30:08.991241 | orchestrator | 18:30:08.991 STDOUT terraform:       + file        = (known after apply)
2025-09-23 18:30:08.991287 | orchestrator | 18:30:08.991 STDOUT terraform:       + id          = (known after apply)
2025-09-23 18:30:08.991334 | orchestrator | 18:30:08.991 STDOUT terraform:       + metadata    = (known after apply)
2025-09-23 18:30:08.991368 | orchestrator | 18:30:08.991 STDOUT terraform:       + min_disk_gb = (known after apply)
2025-09-23 18:30:08.991434 | orchestrator | 18:30:08.991 STDOUT terraform:       + min_ram_mb  = (known after apply)
2025-09-23 18:30:08.994105 | orchestrator | 18:30:08.991 STDOUT terraform:       + most_recent = true
2025-09-23 18:30:08.994112 | orchestrator | 18:30:08.991 STDOUT terraform:       + name        = (known after apply)
2025-09-23 18:30:08.994116 | orchestrator | 18:30:08.991 STDOUT terraform:       + protected   = (known after apply)
2025-09-23 18:30:08.994120 | orchestrator | 18:30:08.991 STDOUT terraform:       + region      = (known after apply)
2025-09-23 18:30:08.994124 | orchestrator | 18:30:08.991 STDOUT terraform:       + schema      = (known after apply)
2025-09-23 18:30:08.994128 | orchestrator | 18:30:08.992 STDOUT terraform:       + size_bytes  = (known after apply)
2025-09-23 18:30:08.994131 | orchestrator | 18:30:08.992 STDOUT terraform:       + tags        = (known after apply)
2025-09-23 18:30:08.994135 | orchestrator | 18:30:08.992 STDOUT terraform:       + updated_at  = (known after apply)
2025-09-23 18:30:08.994139 | orchestrator | 18:30:08.992 STDOUT terraform:     }
2025-09-23 18:30:08.994143 | orchestrator | 18:30:08.992 STDOUT terraform:   # local_file.MANAGER_ADDRESS will be created
2025-09-23 18:30:08.994152 | orchestrator | 18:30:08.992 STDOUT terraform:   + resource "local_file" "MANAGER_ADDRESS" {
2025-09-23 18:30:08.994156 | orchestrator | 18:30:08.992 STDOUT terraform:       + content              = (known after apply)
2025-09-23 18:30:08.994160 | orchestrator | 18:30:08.992 STDOUT terraform:       + content_base64sha256 = (known after apply)
2025-09-23 18:30:08.994164 | orchestrator | 18:30:08.992 STDOUT terraform:       + content_base64sha512 = (known after apply)
2025-09-23 18:30:08.994168 | orchestrator | 18:30:08.992 STDOUT terraform:       + content_md5          = (known after apply)
2025-09-23 18:30:08.994172 | orchestrator | 18:30:08.992 STDOUT terraform:       + content_sha1         = (known after apply)
2025-09-23 18:30:08.994176 | orchestrator | 18:30:08.992 STDOUT terraform:       + content_sha256       = (known after apply)
2025-09-23 18:30:08.994179 | orchestrator | 18:30:08.992 STDOUT terraform:       + content_sha512       = (known after apply)
2025-09-23 18:30:08.994183 | orchestrator | 18:30:08.992 STDOUT terraform:       + directory_permission = "0777"
2025-09-23 18:30:08.994187 | orchestrator | 18:30:08.992 STDOUT terraform:       + file_permission      = "0644"
2025-09-23 18:30:08.994191 | orchestrator | 18:30:08.993 STDOUT terraform:       + filename             = ".MANAGER_ADDRESS.ci"
2025-09-23 18:30:08.994195 | orchestrator | 18:30:08.993 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:08.994199 | orchestrator | 18:30:08.993 STDOUT terraform:     }
2025-09-23 18:30:08.994380 | orchestrator | 18:30:08.994 STDOUT terraform:   # local_file.id_rsa_pub will be created
2025-09-23 18:30:08.994445 | orchestrator | 18:30:08.994 STDOUT terraform:   + resource "local_file" "id_rsa_pub" {
2025-09-23 18:30:08.996559 | orchestrator | 18:30:08.994 STDOUT terraform:       + content              = (known after apply)
2025-09-23 18:30:08.996706 | orchestrator | 18:30:08.994 STDOUT terraform:       + content_base64sha256 = (known after apply)
2025-09-23 18:30:08.996718 | orchestrator | 18:30:08.994 STDOUT terraform:       + content_base64sha512 = (known after apply)
2025-09-23 18:30:08.996724 | orchestrator | 18:30:08.994 STDOUT terraform:       + content_md5          = (known after apply)
2025-09-23 18:30:08.996729 | orchestrator | 18:30:08.994 STDOUT terraform:       + content_sha1         = (known after apply)
2025-09-23 18:30:08.996735 | orchestrator | 18:30:08.994 STDOUT terraform:       + content_sha256       = (known after apply)
2025-09-23 18:30:08.996740 | orchestrator | 18:30:08.994 STDOUT terraform:       + content_sha512       = (known after apply)
2025-09-23 18:30:08.996746 | orchestrator | 18:30:08.995 STDOUT terraform:       + directory_permission = "0777"
2025-09-23 18:30:08.996751 | orchestrator | 18:30:08.995 STDOUT terraform:       + file_permission      = "0644"
2025-09-23 18:30:08.996757 | orchestrator | 18:30:08.995 STDOUT terraform:       + filename             = ".id_rsa.ci.pub"
2025-09-23 18:30:08.996762 | orchestrator | 18:30:08.995 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:08.996768 | orchestrator | 18:30:08.995 STDOUT terraform:     }
2025-09-23 18:30:08.996781 | orchestrator | 18:30:08.995 STDOUT terraform:   # local_file.inventory will be created
2025-09-23 18:30:08.996787 | orchestrator | 18:30:08.995 STDOUT terraform:   + resource "local_file" "inventory" {
2025-09-23 18:30:08.996792 | orchestrator | 18:30:08.995 STDOUT terraform:       + content              = (known after apply)
2025-09-23 18:30:08.996806 | orchestrator | 18:30:08.995 STDOUT terraform:       + content_base64sha256 = (known after apply)
2025-09-23 18:30:08.996812 | orchestrator | 18:30:08.995 STDOUT terraform:       + content_base64sha512 = (known after apply)
2025-09-23 18:30:08.996817 | orchestrator | 18:30:08.995 STDOUT terraform:       + content_md5          = (known after apply)
2025-09-23 18:30:08.996823 | orchestrator | 18:30:08.995 STDOUT terraform:       + content_sha1         = (known after apply)
2025-09-23 18:30:08.996829 | orchestrator | 18:30:08.995 STDOUT terraform:       + content_sha256       = (known after apply)
2025-09-23 18:30:08.996834 | orchestrator | 18:30:08.996 STDOUT terraform:       + content_sha512       = (known after apply)
2025-09-23 18:30:08.996840 | orchestrator | 18:30:08.996 STDOUT terraform:       + directory_permission = "0777"
2025-09-23 18:30:08.996845 | orchestrator | 18:30:08.996 STDOUT terraform:       + file_permission      = "0644"
2025-09-23 18:30:08.996851 | orchestrator | 18:30:08.996 STDOUT terraform:       + filename             = "inventory.ci"
2025-09-23 18:30:08.996856 | orchestrator | 18:30:08.996 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:08.996862 | orchestrator | 18:30:08.996 STDOUT terraform:     }
2025-09-23 18:30:08.998060 | orchestrator | 18:30:08.996 STDOUT terraform:   # local_sensitive_file.id_rsa will be created
2025-09-23 18:30:08.998436 | orchestrator | 18:30:08.997 STDOUT terraform:   + resource "local_sensitive_file" "id_rsa" {
2025-09-23 18:30:08.998685 | orchestrator | 18:30:08.998 STDOUT terraform:       + content              = (sensitive value)
2025-09-23 18:30:08.998943 | orchestrator | 18:30:08.998 STDOUT terraform:       + content_base64sha256 = (known after apply)
2025-09-23 18:30:08.999239 | orchestrator | 18:30:08.998 STDOUT terraform:       + content_base64sha512 = (known after apply)
2025-09-23 18:30:09.002187 | orchestrator | 18:30:08.999 STDOUT terraform:       + content_md5          = (known after apply)
2025-09-23 18:30:09.002631 | orchestrator | 18:30:09.002 STDOUT terraform:       + content_sha1         = (known after apply)
2025-09-23 18:30:09.006062 | orchestrator | 18:30:09.002 STDOUT terraform:       + content_sha256       = (known after apply)
2025-09-23 18:30:09.006078 | orchestrator | 18:30:09.002 STDOUT terraform:       + content_sha512       = (known after apply)
2025-09-23 18:30:09.006083 | orchestrator | 18:30:09.002 STDOUT terraform:       + directory_permission = "0700"
2025-09-23 18:30:09.006087 | orchestrator | 18:30:09.002 STDOUT terraform:       + file_permission      = "0600"
2025-09-23 18:30:09.006091 | orchestrator | 18:30:09.002 STDOUT terraform:       + filename             = ".id_rsa.ci"
2025-09-23 18:30:09.006096 | orchestrator | 18:30:09.002 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:09.006100 | orchestrator | 18:30:09.003 STDOUT terraform:     }
2025-09-23 18:30:09.006104 | orchestrator | 18:30:09.003 STDOUT terraform:   # null_resource.node_semaphore will be created
2025-09-23 18:30:09.006109 | orchestrator | 18:30:09.003 STDOUT terraform:   + resource "null_resource" "node_semaphore" {
2025-09-23 18:30:09.006113 | orchestrator | 18:30:09.003 STDOUT terraform:       + id = (known after apply)
2025-09-23 18:30:09.006117 | orchestrator | 18:30:09.003 STDOUT terraform:     }
2025-09-23 18:30:09.006122 | orchestrator | 18:30:09.003 STDOUT terraform:   # openstack_blockstorage_volume_v3.manager_base_volume[0] will be created
2025-09-23 18:30:09.006138 | orchestrator | 18:30:09.003 STDOUT terraform:   + resource "openstack_blockstorage_volume_v3" "manager_base_volume" {
2025-09-23 18:30:09.006143 | orchestrator | 18:30:09.003 STDOUT terraform:       + attachment           = (known after apply)
2025-09-23 18:30:09.006147 | orchestrator | 18:30:09.003 STDOUT terraform:       + availability_zone    = "nova"
2025-09-23 18:30:09.006152 | orchestrator | 18:30:09.003 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:09.006156 | orchestrator | 18:30:09.003 STDOUT terraform:       + image_id             = (known after apply)
2025-09-23 18:30:09.006160 | orchestrator | 18:30:09.003 STDOUT terraform:       + metadata             = (known after apply)
2025-09-23 18:30:09.006164 | orchestrator | 18:30:09.003 STDOUT terraform:       + name                 = "testbed-volume-manager-base"
2025-09-23 18:30:09.006168 | orchestrator | 18:30:09.003 STDOUT terraform:       + region               = (known after apply)
2025-09-23 18:30:09.006172 | orchestrator | 18:30:09.003 STDOUT terraform:       + size                 = 80
2025-09-23 18:30:09.006177 | orchestrator | 18:30:09.003 STDOUT terraform:       + volume_retype_policy = "never"
2025-09-23 18:30:09.006181 | orchestrator | 18:30:09.003 STDOUT terraform:       + volume_type          = "ssd"
2025-09-23 18:30:09.006185 | orchestrator | 18:30:09.003 STDOUT terraform:     }
2025-09-23 18:30:09.006189 | orchestrator | 18:30:09.003 STDOUT terraform:   # openstack_blockstorage_volume_v3.node_base_volume[0] will be created
2025-09-23 18:30:09.006202 | orchestrator | 18:30:09.003 STDOUT terraform:   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2025-09-23 18:30:09.006207 | orchestrator | 18:30:09.003 STDOUT terraform:       + attachment           = (known after apply)
2025-09-23 18:30:09.006211 | orchestrator | 18:30:09.003 STDOUT terraform:       + availability_zone    = "nova"
2025-09-23 18:30:09.006215 | orchestrator | 18:30:09.003 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:09.006219 | orchestrator | 18:30:09.003 STDOUT terraform:       + image_id             = (known after apply)
2025-09-23 18:30:09.006223 | orchestrator | 18:30:09.003 STDOUT terraform:       + metadata             = (known after apply)
2025-09-23 18:30:09.006228 | orchestrator | 18:30:09.003 STDOUT terraform:       + name                 = "testbed-volume-0-node-base"
2025-09-23 18:30:09.006232 | orchestrator | 18:30:09.003 STDOUT terraform:       + region               = (known after apply)
2025-09-23 18:30:09.006236 | orchestrator | 18:30:09.004 STDOUT terraform:       + size                 = 80
2025-09-23 18:30:09.006240 | orchestrator | 18:30:09.004 STDOUT terraform:       + volume_retype_policy = "never"
2025-09-23 18:30:09.006244 | orchestrator | 18:30:09.004 STDOUT terraform:       + volume_type          = "ssd"
2025-09-23 18:30:09.006249 | orchestrator | 18:30:09.004 STDOUT terraform:     }
2025-09-23 18:30:09.006260 | orchestrator | 18:30:09.004 STDOUT terraform:   # openstack_blockstorage_volume_v3.node_base_volume[1] will be created
2025-09-23 18:30:09.006264 | orchestrator | 18:30:09.004 STDOUT terraform:   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2025-09-23 18:30:09.006268 | orchestrator | 18:30:09.004 STDOUT terraform:       + attachment           = (known after apply)
2025-09-23 18:30:09.006276 | orchestrator | 18:30:09.004 STDOUT terraform:       + availability_zone    = "nova"
2025-09-23 18:30:09.006280 | orchestrator | 18:30:09.004 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:09.006284 | orchestrator | 18:30:09.004 STDOUT terraform:       + image_id             = (known after apply)
2025-09-23 18:30:09.006288 | orchestrator | 18:30:09.004 STDOUT terraform:       + metadata             = (known after apply)
2025-09-23 18:30:09.006293 | orchestrator | 18:30:09.004 STDOUT terraform:       + name                 = "testbed-volume-1-node-base"
2025-09-23 18:30:09.006297 | orchestrator | 18:30:09.004 STDOUT terraform:       + region               = (known after apply)
2025-09-23 18:30:09.006301 | orchestrator | 18:30:09.004 STDOUT terraform:       + size                 = 80
2025-09-23 18:30:09.006305 | orchestrator | 18:30:09.004 STDOUT terraform:       + volume_retype_policy = "never"
2025-09-23 18:30:09.006309 | orchestrator | 18:30:09.004 STDOUT terraform:       + volume_type          = "ssd"
2025-09-23 18:30:09.006314 | orchestrator | 18:30:09.004 STDOUT terraform:     }
2025-09-23 18:30:09.006318 | orchestrator | 18:30:09.004 STDOUT terraform:   # openstack_blockstorage_volume_v3.node_base_volume[2] will be created
2025-09-23 18:30:09.006322 | orchestrator | 18:30:09.004 STDOUT terraform:   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2025-09-23 18:30:09.006326 | orchestrator | 18:30:09.004 STDOUT terraform:       + attachment           = (known after apply)
2025-09-23 18:30:09.006332 | orchestrator | 18:30:09.004 STDOUT terraform:       + availability_zone    = "nova"
2025-09-23 18:30:09.006337 | orchestrator | 18:30:09.004 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:09.006341 | orchestrator | 18:30:09.004 STDOUT terraform:       + image_id             = (known after apply)
2025-09-23 18:30:09.006345 | orchestrator | 18:30:09.004 STDOUT terraform:       + metadata             = (known after apply)
2025-09-23 18:30:09.006349 | orchestrator | 18:30:09.004 STDOUT terraform:       + name                 = "testbed-volume-2-node-base"
2025-09-23 18:30:09.006353 | orchestrator | 18:30:09.004 STDOUT terraform:       + region               = (known after apply)
2025-09-23 18:30:09.006357 | orchestrator | 18:30:09.004 STDOUT terraform:       + size                 = 80
2025-09-23 18:30:09.006362 | orchestrator | 18:30:09.004 STDOUT terraform:       + volume_retype_policy = "never"
2025-09-23 18:30:09.006366 | orchestrator | 18:30:09.004 STDOUT terraform:       + volume_type          = "ssd"
2025-09-23 18:30:09.006370 | orchestrator | 18:30:09.004 STDOUT terraform:     }
2025-09-23 18:30:09.006374 | orchestrator | 18:30:09.004 STDOUT terraform:   # openstack_blockstorage_volume_v3.node_base_volume[3] will be created
2025-09-23 18:30:09.006378 | orchestrator | 18:30:09.005 STDOUT terraform:   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2025-09-23 18:30:09.006382 | orchestrator | 18:30:09.005 STDOUT terraform:       + attachment           = (known after apply)
2025-09-23 18:30:09.006386 | orchestrator | 18:30:09.005 STDOUT terraform:       + availability_zone    = "nova"
2025-09-23 18:30:09.006391 | orchestrator | 18:30:09.005 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:09.006395 | orchestrator | 18:30:09.005 STDOUT terraform:       + image_id             = (known after apply)
2025-09-23 18:30:09.006399 | orchestrator | 18:30:09.005 STDOUT terraform:       + metadata             = (known after apply)
2025-09-23 18:30:09.006406 | orchestrator | 18:30:09.005 STDOUT terraform:       + name                 = "testbed-volume-3-node-base"
2025-09-23 18:30:09.006410 | orchestrator | 18:30:09.005 STDOUT terraform:       + region               = (known after apply)
2025-09-23 18:30:09.006414 | orchestrator | 18:30:09.005 STDOUT terraform:       + size                 = 80
2025-09-23 18:30:09.006418 | orchestrator | 18:30:09.005 STDOUT terraform:       + volume_retype_policy = "never"
2025-09-23 18:30:09.006425 | orchestrator | 18:30:09.005 STDOUT terraform:       + volume_type          = "ssd"
2025-09-23 18:30:09.006429 | orchestrator | 18:30:09.005 STDOUT terraform:     }
2025-09-23 18:30:09.006434 | orchestrator | 18:30:09.005 STDOUT terraform:   # openstack_blockstorage_volume_v3.node_base_volume[4] will be created
2025-09-23 18:30:09.006440 | orchestrator | 18:30:09.005 STDOUT terraform:   + resource "openstack_blockstorage_volume_v3" "node_base_volume" {
2025-09-23 18:30:09.006444 | orchestrator | 18:30:09.005 STDOUT terraform:       + attachment           = (known after apply)
2025-09-23 18:30:09.006448 | orchestrator | 18:30:09.005 STDOUT terraform:       + availability_zone    = "nova"
2025-09-23 18:30:09.006453 | orchestrator | 18:30:09.005 STDOUT terraform:       + id                   = (known after apply)
2025-09-23 18:30:09.006457 | orchestrator | 18:30:09.005 STDOUT terraform:       + image_id             = (known after apply)
2025-09-23 18:30:09.006461 | orchestrator | 18:30:09.005 STDOUT
terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.006465 | orchestrator | 18:30:09.005 STDOUT terraform:  + name = "testbed-volume-4-node-base" 2025-09-23 18:30:09.006469 | orchestrator | 18:30:09.005 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.006473 | orchestrator | 18:30:09.005 STDOUT terraform:  + size = 80 2025-09-23 18:30:09.006477 | orchestrator | 18:30:09.005 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.006482 | orchestrator | 18:30:09.005 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.006486 | orchestrator | 18:30:09.005 STDOUT terraform:  } 2025-09-23 18:30:09.006490 | orchestrator | 18:30:09.005 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[5] will be created 2025-09-23 18:30:09.006494 | orchestrator | 18:30:09.005 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-09-23 18:30:09.006498 | orchestrator | 18:30:09.005 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.006502 | orchestrator | 18:30:09.005 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.006507 | orchestrator | 18:30:09.005 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.006511 | orchestrator | 18:30:09.005 STDOUT terraform:  + image_id = (known after apply) 2025-09-23 18:30:09.006567 | orchestrator | 18:30:09.006 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.006637 | orchestrator | 18:30:09.006 STDOUT terraform:  + name = "testbed-volume-5-node-base" 2025-09-23 18:30:09.006691 | orchestrator | 18:30:09.006 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.006725 | orchestrator | 18:30:09.006 STDOUT terraform:  + size = 80 2025-09-23 18:30:09.006769 | orchestrator | 18:30:09.006 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.006805 | orchestrator | 18:30:09.006 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 
18:30:09.006832 | orchestrator | 18:30:09.006 STDOUT terraform:  } 2025-09-23 18:30:09.006890 | orchestrator | 18:30:09.006 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[0] will be created 2025-09-23 18:30:09.006949 | orchestrator | 18:30:09.006 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.007215 | orchestrator | 18:30:09.006 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.007268 | orchestrator | 18:30:09.007 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.007319 | orchestrator | 18:30:09.007 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.007419 | orchestrator | 18:30:09.007 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.007477 | orchestrator | 18:30:09.007 STDOUT terraform:  + name = "testbed-volume-0-node-3" 2025-09-23 18:30:09.007537 | orchestrator | 18:30:09.007 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.007570 | orchestrator | 18:30:09.007 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.007606 | orchestrator | 18:30:09.007 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.007673 | orchestrator | 18:30:09.007 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.007699 | orchestrator | 18:30:09.007 STDOUT terraform:  } 2025-09-23 18:30:09.007759 | orchestrator | 18:30:09.007 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[1] will be created 2025-09-23 18:30:09.007815 | orchestrator | 18:30:09.007 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.007864 | orchestrator | 18:30:09.007 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.007898 | orchestrator | 18:30:09.007 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.007950 | orchestrator | 18:30:09.007 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.007998 | 
orchestrator | 18:30:09.007 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.008050 | orchestrator | 18:30:09.008 STDOUT terraform:  + name = "testbed-volume-1-node-4" 2025-09-23 18:30:09.008097 | orchestrator | 18:30:09.008 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.008132 | orchestrator | 18:30:09.008 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.008167 | orchestrator | 18:30:09.008 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.008203 | orchestrator | 18:30:09.008 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.008225 | orchestrator | 18:30:09.008 STDOUT terraform:  } 2025-09-23 18:30:09.008278 | orchestrator | 18:30:09.008 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[2] will be created 2025-09-23 18:30:09.008385 | orchestrator | 18:30:09.008 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.008438 | orchestrator | 18:30:09.008 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.008481 | orchestrator | 18:30:09.008 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.008594 | orchestrator | 18:30:09.008 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.008672 | orchestrator | 18:30:09.008 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.008721 | orchestrator | 18:30:09.008 STDOUT terraform:  + name = "testbed-volume-2-node-5" 2025-09-23 18:30:09.008767 | orchestrator | 18:30:09.008 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.008797 | orchestrator | 18:30:09.008 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.008836 | orchestrator | 18:30:09.008 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.008869 | orchestrator | 18:30:09.008 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.008893 | orchestrator | 18:30:09.008 STDOUT terraform:  } 2025-09-23 18:30:09.008944 | 
orchestrator | 18:30:09.008 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[3] will be created 2025-09-23 18:30:09.008997 | orchestrator | 18:30:09.008 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.009040 | orchestrator | 18:30:09.009 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.009073 | orchestrator | 18:30:09.009 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.009116 | orchestrator | 18:30:09.009 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.009160 | orchestrator | 18:30:09.009 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.009206 | orchestrator | 18:30:09.009 STDOUT terraform:  + name = "testbed-volume-3-node-3" 2025-09-23 18:30:09.009249 | orchestrator | 18:30:09.009 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.009280 | orchestrator | 18:30:09.009 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.009311 | orchestrator | 18:30:09.009 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.009345 | orchestrator | 18:30:09.009 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.009367 | orchestrator | 18:30:09.009 STDOUT terraform:  } 2025-09-23 18:30:09.009421 | orchestrator | 18:30:09.009 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[4] will be created 2025-09-23 18:30:09.009471 | orchestrator | 18:30:09.009 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.009517 | orchestrator | 18:30:09.009 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.009549 | orchestrator | 18:30:09.009 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.009611 | orchestrator | 18:30:09.009 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.009664 | orchestrator | 18:30:09.009 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 
18:30:09.009712 | orchestrator | 18:30:09.009 STDOUT terraform:  + name = "testbed-volume-4-node-4" 2025-09-23 18:30:09.009774 | orchestrator | 18:30:09.009 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.009811 | orchestrator | 18:30:09.009 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.009843 | orchestrator | 18:30:09.009 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.009878 | orchestrator | 18:30:09.009 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.009900 | orchestrator | 18:30:09.009 STDOUT terraform:  } 2025-09-23 18:30:09.009952 | orchestrator | 18:30:09.009 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[5] will be created 2025-09-23 18:30:09.010005 | orchestrator | 18:30:09.009 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.010060 | orchestrator | 18:30:09.010 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.010095 | orchestrator | 18:30:09.010 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.010138 | orchestrator | 18:30:09.010 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.010182 | orchestrator | 18:30:09.010 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.010226 | orchestrator | 18:30:09.010 STDOUT terraform:  + name = "testbed-volume-5-node-5" 2025-09-23 18:30:09.010272 | orchestrator | 18:30:09.010 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.010303 | orchestrator | 18:30:09.010 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.010338 | orchestrator | 18:30:09.010 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.010369 | orchestrator | 18:30:09.010 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.010394 | orchestrator | 18:30:09.010 STDOUT terraform:  } 2025-09-23 18:30:09.010445 | orchestrator | 18:30:09.010 STDOUT terraform:  # 
openstack_blockstorage_volume_v3.node_volume[6] will be created 2025-09-23 18:30:09.010497 | orchestrator | 18:30:09.010 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.010539 | orchestrator | 18:30:09.010 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.010572 | orchestrator | 18:30:09.010 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.010615 | orchestrator | 18:30:09.010 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.010687 | orchestrator | 18:30:09.010 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.010735 | orchestrator | 18:30:09.010 STDOUT terraform:  + name = "testbed-volume-6-node-3" 2025-09-23 18:30:09.010781 | orchestrator | 18:30:09.010 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.010810 | orchestrator | 18:30:09.010 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.010860 | orchestrator | 18:30:09.010 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.010898 | orchestrator | 18:30:09.010 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.010921 | orchestrator | 18:30:09.010 STDOUT terraform:  } 2025-09-23 18:30:09.010976 | orchestrator | 18:30:09.010 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[7] will be created 2025-09-23 18:30:09.011040 | orchestrator | 18:30:09.010 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.011093 | orchestrator | 18:30:09.011 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.011126 | orchestrator | 18:30:09.011 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.012401 | orchestrator | 18:30:09.011 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.012481 | orchestrator | 18:30:09.012 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.012544 | orchestrator | 18:30:09.012 STDOUT 
terraform:  + name = "testbed-volume-7-node-4" 2025-09-23 18:30:09.012589 | orchestrator | 18:30:09.012 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.012648 | orchestrator | 18:30:09.012 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.012685 | orchestrator | 18:30:09.012 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.012720 | orchestrator | 18:30:09.012 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.012742 | orchestrator | 18:30:09.012 STDOUT terraform:  } 2025-09-23 18:30:09.012796 | orchestrator | 18:30:09.012 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[8] will be created 2025-09-23 18:30:09.012846 | orchestrator | 18:30:09.012 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-23 18:30:09.012893 | orchestrator | 18:30:09.012 STDOUT terraform:  + attachment = (known after apply) 2025-09-23 18:30:09.012924 | orchestrator | 18:30:09.012 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.012971 | orchestrator | 18:30:09.012 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.013013 | orchestrator | 18:30:09.012 STDOUT terraform:  + metadata = (known after apply) 2025-09-23 18:30:09.013065 | orchestrator | 18:30:09.013 STDOUT terraform:  + name = "testbed-volume-8-node-5" 2025-09-23 18:30:09.013111 | orchestrator | 18:30:09.013 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.013518 | orchestrator | 18:30:09.013 STDOUT terraform:  + size = 20 2025-09-23 18:30:09.013563 | orchestrator | 18:30:09.013 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-23 18:30:09.013597 | orchestrator | 18:30:09.013 STDOUT terraform:  + volume_type = "ssd" 2025-09-23 18:30:09.013634 | orchestrator | 18:30:09.013 STDOUT terraform:  } 2025-09-23 18:30:09.013704 | orchestrator | 18:30:09.013 STDOUT terraform:  # openstack_compute_instance_v2.manager_server will be created 2025-09-23 18:30:09.013759 | 
orchestrator | 18:30:09.013 STDOUT terraform:  + resource "openstack_compute_instance_v2" "manager_server" { 2025-09-23 18:30:09.013804 | orchestrator | 18:30:09.013 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-09-23 18:30:09.013862 | orchestrator | 18:30:09.013 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-09-23 18:30:09.013904 | orchestrator | 18:30:09.013 STDOUT terraform:  + all_metadata = (known after apply) 2025-09-23 18:30:09.014147 | orchestrator | 18:30:09.014 STDOUT terraform:  + all_tags = (known after apply) 2025-09-23 18:30:09.014193 | orchestrator | 18:30:09.014 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.014237 | orchestrator | 18:30:09.014 STDOUT terraform:  + config_drive = true 2025-09-23 18:30:09.014280 | orchestrator | 18:30:09.014 STDOUT terraform:  + created = (known after apply) 2025-09-23 18:30:09.014321 | orchestrator | 18:30:09.014 STDOUT terraform:  + flavor_id = (known after apply) 2025-09-23 18:30:09.014365 | orchestrator | 18:30:09.014 STDOUT terraform:  + flavor_name = "OSISM-4V-16" 2025-09-23 18:30:09.014404 | orchestrator | 18:30:09.014 STDOUT terraform:  + force_delete = false 2025-09-23 18:30:09.014896 | orchestrator | 18:30:09.014 STDOUT terraform:  + hypervisor_hostname = (known after apply) 2025-09-23 18:30:09.014940 | orchestrator | 18:30:09.014 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.014981 | orchestrator | 18:30:09.014 STDOUT terraform:  + image_id = (known after apply) 2025-09-23 18:30:09.015023 | orchestrator | 18:30:09.014 STDOUT terraform:  + image_name = (known after apply) 2025-09-23 18:30:09.016366 | orchestrator | 18:30:09.015 STDOUT terraform:  + key_pair = "testbed" 2025-09-23 18:30:09.016441 | orchestrator | 18:30:09.016 STDOUT terraform:  + name = "testbed-manager" 2025-09-23 18:30:09.016476 | orchestrator | 18:30:09.016 STDOUT terraform:  + power_state = "active" 2025-09-23 18:30:09.016518 | orchestrator | 18:30:09.016 STDOUT 
terraform:  + region = (known after apply) 2025-09-23 18:30:09.016560 | orchestrator | 18:30:09.016 STDOUT terraform:  + security_groups = (known after apply) 2025-09-23 18:30:09.016591 | orchestrator | 18:30:09.016 STDOUT terraform:  + stop_before_destroy = false 2025-09-23 18:30:09.016642 | orchestrator | 18:30:09.016 STDOUT terraform:  + updated = (known after apply) 2025-09-23 18:30:09.016681 | orchestrator | 18:30:09.016 STDOUT terraform:  + user_data = (sensitive value) 2025-09-23 18:30:09.016705 | orchestrator | 18:30:09.016 STDOUT terraform:  + block_device { 2025-09-23 18:30:09.016735 | orchestrator | 18:30:09.016 STDOUT terraform:  + boot_index = 0 2025-09-23 18:30:09.016768 | orchestrator | 18:30:09.016 STDOUT terraform:  + delete_on_termination = false 2025-09-23 18:30:09.016804 | orchestrator | 18:30:09.016 STDOUT terraform:  + destination_type = "volume" 2025-09-23 18:30:09.016838 | orchestrator | 18:30:09.016 STDOUT terraform:  + multiattach = false 2025-09-23 18:30:09.016874 | orchestrator | 18:30:09.016 STDOUT terraform:  + source_type = "volume" 2025-09-23 18:30:09.016927 | orchestrator | 18:30:09.016 STDOUT terraform:  + uuid = (known after apply) 2025-09-23 18:30:09.016950 | orchestrator | 18:30:09.016 STDOUT terraform:  } 2025-09-23 18:30:09.016970 | orchestrator | 18:30:09.016 STDOUT terraform:  + network { 2025-09-23 18:30:09.016997 | orchestrator | 18:30:09.016 STDOUT terraform:  + access_network = false 2025-09-23 18:30:09.017033 | orchestrator | 18:30:09.017 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-09-23 18:30:09.017069 | orchestrator | 18:30:09.017 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-09-23 18:30:09.017106 | orchestrator | 18:30:09.017 STDOUT terraform:  + mac = (known after apply) 2025-09-23 18:30:09.017150 | orchestrator | 18:30:09.017 STDOUT terraform:  + name = (known after apply) 2025-09-23 18:30:09.017188 | orchestrator | 18:30:09.017 STDOUT terraform:  + port = (known after apply) 
2025-09-23 18:30:09.017224 | orchestrator | 18:30:09.017 STDOUT terraform:  + uuid = (known after apply) 2025-09-23 18:30:09.017243 | orchestrator | 18:30:09.017 STDOUT terraform:  } 2025-09-23 18:30:09.017263 | orchestrator | 18:30:09.017 STDOUT terraform:  } 2025-09-23 18:30:09.017367 | orchestrator | 18:30:09.017 STDOUT terraform:  # openstack_compute_instance_v2.node_server[0] will be created 2025-09-23 18:30:09.017418 | orchestrator | 18:30:09.017 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-09-23 18:30:09.017458 | orchestrator | 18:30:09.017 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-09-23 18:30:09.017501 | orchestrator | 18:30:09.017 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-09-23 18:30:09.017543 | orchestrator | 18:30:09.017 STDOUT terraform:  + all_metadata = (known after apply) 2025-09-23 18:30:09.017582 | orchestrator | 18:30:09.017 STDOUT terraform:  + all_tags = (known after apply) 2025-09-23 18:30:09.017613 | orchestrator | 18:30:09.017 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.017651 | orchestrator | 18:30:09.017 STDOUT terraform:  + config_drive = true 2025-09-23 18:30:09.017693 | orchestrator | 18:30:09.017 STDOUT terraform:  + created = (known after apply) 2025-09-23 18:30:09.017734 | orchestrator | 18:30:09.017 STDOUT terraform:  + flavor_id = (known after apply) 2025-09-23 18:30:09.017769 | orchestrator | 18:30:09.017 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-09-23 18:30:09.017800 | orchestrator | 18:30:09.017 STDOUT terraform:  + force_delete = false 2025-09-23 18:30:09.017841 | orchestrator | 18:30:09.017 STDOUT terraform:  + hypervisor_hostname = (known after apply) 2025-09-23 18:30:09.017883 | orchestrator | 18:30:09.017 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.017924 | orchestrator | 18:30:09.017 STDOUT terraform:  + image_id = (known after apply) 2025-09-23 18:30:09.017965 | orchestrator | 18:30:09.017 
STDOUT terraform:  + image_name = (known after apply) 2025-09-23 18:30:09.017996 | orchestrator | 18:30:09.017 STDOUT terraform:  + key_pair = "testbed" 2025-09-23 18:30:09.018053 | orchestrator | 18:30:09.018 STDOUT terraform:  + name = "testbed-node-0" 2025-09-23 18:30:09.018087 | orchestrator | 18:30:09.018 STDOUT terraform:  + power_state = "active" 2025-09-23 18:30:09.018127 | orchestrator | 18:30:09.018 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.018167 | orchestrator | 18:30:09.018 STDOUT terraform:  + security_groups = (known after apply) 2025-09-23 18:30:09.018196 | orchestrator | 18:30:09.018 STDOUT terraform:  + stop_before_destroy = false 2025-09-23 18:30:09.018237 | orchestrator | 18:30:09.018 STDOUT terraform:  + updated = (known after apply) 2025-09-23 18:30:09.018292 | orchestrator | 18:30:09.018 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-09-23 18:30:09.018315 | orchestrator | 18:30:09.018 STDOUT terraform:  + block_device { 2025-09-23 18:30:09.018350 | orchestrator | 18:30:09.018 STDOUT terraform:  + boot_index = 0 2025-09-23 18:30:09.018383 | orchestrator | 18:30:09.018 STDOUT terraform:  + delete_on_termination = false 2025-09-23 18:30:09.018419 | orchestrator | 18:30:09.018 STDOUT terraform:  + destination_type = "volume" 2025-09-23 18:30:09.018452 | orchestrator | 18:30:09.018 STDOUT terraform:  + multiattach = false 2025-09-23 18:30:09.018487 | orchestrator | 18:30:09.018 STDOUT terraform:  + source_type = "volume" 2025-09-23 18:30:09.018529 | orchestrator | 18:30:09.018 STDOUT terraform:  + uuid = (known after apply) 2025-09-23 18:30:09.018549 | orchestrator | 18:30:09.018 STDOUT terraform:  } 2025-09-23 18:30:09.018571 | orchestrator | 18:30:09.018 STDOUT terraform:  + network { 2025-09-23 18:30:09.018597 | orchestrator | 18:30:09.018 STDOUT terraform:  + access_network = false 2025-09-23 18:30:09.018661 | orchestrator | 18:30:09.018 STDOUT terraform:  + fixed_ip_v4 = (known 
after apply) 2025-09-23 18:30:09.018701 | orchestrator | 18:30:09.018 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-09-23 18:30:09.018739 | orchestrator | 18:30:09.018 STDOUT terraform:  + mac = (known after apply) 2025-09-23 18:30:09.018777 | orchestrator | 18:30:09.018 STDOUT terraform:  + name = (known after apply) 2025-09-23 18:30:09.018816 | orchestrator | 18:30:09.018 STDOUT terraform:  + port = (known after apply) 2025-09-23 18:30:09.018854 | orchestrator | 18:30:09.018 STDOUT terraform:  + uuid = (known after apply) 2025-09-23 18:30:09.018876 | orchestrator | 18:30:09.018 STDOUT terraform:  } 2025-09-23 18:30:09.018898 | orchestrator | 18:30:09.018 STDOUT terraform:  } 2025-09-23 18:30:09.018947 | orchestrator | 18:30:09.018 STDOUT terraform:  # openstack_compute_instance_v2.node_server[1] will be created 2025-09-23 18:30:09.018996 | orchestrator | 18:30:09.018 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-09-23 18:30:09.019037 | orchestrator | 18:30:09.019 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-09-23 18:30:09.019078 | orchestrator | 18:30:09.019 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-09-23 18:30:09.019119 | orchestrator | 18:30:09.019 STDOUT terraform:  + all_metadata = (known after apply) 2025-09-23 18:30:09.019161 | orchestrator | 18:30:09.019 STDOUT terraform:  + all_tags = (known after apply) 2025-09-23 18:30:09.019191 | orchestrator | 18:30:09.019 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.019219 | orchestrator | 18:30:09.019 STDOUT terraform:  + config_drive = true 2025-09-23 18:30:09.019260 | orchestrator | 18:30:09.019 STDOUT terraform:  + created = (known after apply) 2025-09-23 18:30:09.019302 | orchestrator | 18:30:09.019 STDOUT terraform:  + flavor_id = (known after apply) 2025-09-23 18:30:09.019338 | orchestrator | 18:30:09.019 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-09-23 18:30:09.019369 | orchestrator | 
18:30:09.019 STDOUT terraform:  + force_delete = false 2025-09-23 18:30:09.019412 | orchestrator | 18:30:09.019 STDOUT terraform:  + hypervisor_hostname = (known after apply) 2025-09-23 18:30:09.019461 | orchestrator | 18:30:09.019 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.019503 | orchestrator | 18:30:09.019 STDOUT terraform:  + image_id = (known after apply) 2025-09-23 18:30:09.019544 | orchestrator | 18:30:09.019 STDOUT terraform:  + image_name = (known after apply) 2025-09-23 18:30:09.019578 | orchestrator | 18:30:09.019 STDOUT terraform:  + key_pair = "testbed" 2025-09-23 18:30:09.019616 | orchestrator | 18:30:09.019 STDOUT terraform:  + name = "testbed-node-1" 2025-09-23 18:30:09.019656 | orchestrator | 18:30:09.019 STDOUT terraform:  + power_state = "active" 2025-09-23 18:30:09.019697 | orchestrator | 18:30:09.019 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.019737 | orchestrator | 18:30:09.019 STDOUT terraform:  + security_groups = (known after apply) 2025-09-23 18:30:09.019767 | orchestrator | 18:30:09.019 STDOUT terraform:  + stop_before_destroy = false 2025-09-23 18:30:09.019808 | orchestrator | 18:30:09.019 STDOUT terraform:  + updated = (known after apply) 2025-09-23 18:30:09.019863 | orchestrator | 18:30:09.019 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-09-23 18:30:09.019888 | orchestrator | 18:30:09.019 STDOUT terraform:  + block_device { 2025-09-23 18:30:09.019919 | orchestrator | 18:30:09.019 STDOUT terraform:  + boot_index = 0 2025-09-23 18:30:09.019956 | orchestrator | 18:30:09.019 STDOUT terraform:  + delete_on_termination = false 2025-09-23 18:30:09.019991 | orchestrator | 18:30:09.019 STDOUT terraform:  + destination_type = "volume" 2025-09-23 18:30:09.020025 | orchestrator | 18:30:09.019 STDOUT terraform:  + multiattach = false 2025-09-23 18:30:09.020062 | orchestrator | 18:30:09.020 STDOUT terraform:  + source_type = "volume" 2025-09-23 18:30:09.020105 | 
orchestrator | 18:30:09.020 STDOUT terraform:  + uuid = (known after apply) 2025-09-23 18:30:09.020128 | orchestrator | 18:30:09.020 STDOUT terraform:  } 2025-09-23 18:30:09.020151 | orchestrator | 18:30:09.020 STDOUT terraform:  + network { 2025-09-23 18:30:09.020178 | orchestrator | 18:30:09.020 STDOUT terraform:  + access_network = false 2025-09-23 18:30:09.020217 | orchestrator | 18:30:09.020 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-09-23 18:30:09.020253 | orchestrator | 18:30:09.020 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-09-23 18:30:09.020290 | orchestrator | 18:30:09.020 STDOUT terraform:  + mac = (known after apply) 2025-09-23 18:30:09.020327 | orchestrator | 18:30:09.020 STDOUT terraform:  + name = (known after apply) 2025-09-23 18:30:09.020364 | orchestrator | 18:30:09.020 STDOUT terraform:  + port = (known after apply) 2025-09-23 18:30:09.020402 | orchestrator | 18:30:09.020 STDOUT terraform:  + uuid = (known after apply) 2025-09-23 18:30:09.020422 | orchestrator | 18:30:09.020 STDOUT terraform:  } 2025-09-23 18:30:09.020442 | orchestrator | 18:30:09.020 STDOUT terraform:  } 2025-09-23 18:30:09.020491 | orchestrator | 18:30:09.020 STDOUT terraform:  # openstack_compute_instance_v2.node_server[2] will be created 2025-09-23 18:30:09.020538 | orchestrator | 18:30:09.020 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-09-23 18:30:09.020583 | orchestrator | 18:30:09.020 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-09-23 18:30:09.020632 | orchestrator | 18:30:09.020 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-09-23 18:30:09.020673 | orchestrator | 18:30:09.020 STDOUT terraform:  + all_metadata = (known after apply) 2025-09-23 18:30:09.020714 | orchestrator | 18:30:09.020 STDOUT terraform:  + all_tags = (known after apply) 2025-09-23 18:30:09.020743 | orchestrator | 18:30:09.020 STDOUT terraform:  + availability_zone = "nova" 2025-09-23 18:30:09.020769 | 
orchestrator | 18:30:09.020 STDOUT terraform:  + config_drive = true 2025-09-23 18:30:09.020810 | orchestrator | 18:30:09.020 STDOUT terraform:  + created = (known after apply) 2025-09-23 18:30:09.020850 | orchestrator | 18:30:09.020 STDOUT terraform:  + flavor_id = (known after apply) 2025-09-23 18:30:09.020884 | orchestrator | 18:30:09.020 STDOUT terraform:  + flavor_name = "OSISM-8V-32" 2025-09-23 18:30:09.020913 | orchestrator | 18:30:09.020 STDOUT terraform:  + force_delete = false 2025-09-23 18:30:09.020951 | orchestrator | 18:30:09.020 STDOUT terraform:  + hypervisor_hostname = (known after apply) 2025-09-23 18:30:09.020991 | orchestrator | 18:30:09.020 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.021030 | orchestrator | 18:30:09.020 STDOUT terraform:  + image_id = (known after apply) 2025-09-23 18:30:09.021071 | orchestrator | 18:30:09.021 STDOUT terraform:  + image_name = (known after apply) 2025-09-23 18:30:09.021101 | orchestrator | 18:30:09.021 STDOUT terraform:  + key_pair = "testbed" 2025-09-23 18:30:09.021136 | orchestrator | 18:30:09.021 STDOUT terraform:  + name = "testbed-node-2" 2025-09-23 18:30:09.021166 | orchestrator | 18:30:09.021 STDOUT terraform:  + power_state = "active" 2025-09-23 18:30:09.021206 | orchestrator | 18:30:09.021 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.021249 | orchestrator | 18:30:09.021 STDOUT terraform:  + security_groups = (known after apply) 2025-09-23 18:30:09.021279 | orchestrator | 18:30:09.021 STDOUT terraform:  + stop_before_destroy = false 2025-09-23 18:30:09.021319 | orchestrator | 18:30:09.021 STDOUT terraform:  + updated = (known after apply) 2025-09-23 18:30:09.021373 | orchestrator | 18:30:09.021 STDOUT terraform:  + user_data = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854" 2025-09-23 18:30:09.021397 | orchestrator | 18:30:09.021 STDOUT terraform:  + block_device { 2025-09-23 18:30:09.021426 | orchestrator | 18:30:09.021 STDOUT terraform:  + boot_index = 0 
2025-09-23 18:30:09.021-09.027 | orchestrator | STDOUT terraform:
          + delete_on_termination = false
          + destination_type      = "volume"
          + multiattach           = false
          + source_type           = "volume"
          + uuid                  = (known after apply)
        }

      + network {
          + access_network = false
          + fixed_ip_v4    = (known after apply)
          + fixed_ip_v6    = (known after apply)
          + mac            = (known after apply)
          + name           = (known after apply)
          + port           = (known after apply)
          + uuid           = (known after apply)
        }
    }

  # openstack_compute_instance_v2.node_server[3] will be created
  + resource "openstack_compute_instance_v2" "node_server" {
      + access_ip_v4        = (known after apply)
      + access_ip_v6        = (known after apply)
      + all_metadata        = (known after apply)
      + all_tags            = (known after apply)
      + availability_zone   = "nova"
      + config_drive        = true
      + created             = (known after apply)
      + flavor_id           = (known after apply)
      + flavor_name         = "OSISM-8V-32"
      + force_delete        = false
      + hypervisor_hostname = (known after apply)
      + id                  = (known after apply)
      + image_id            = (known after apply)
      + image_name          = (known after apply)
      + key_pair            = "testbed"
      + name                = "testbed-node-3"
      + power_state         = "active"
      + region              = (known after apply)
      + security_groups     = (known after apply)
      + stop_before_destroy = false
      + updated             = (known after apply)
      + user_data           = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854"

      + block_device {
          + boot_index            = 0
          + delete_on_termination = false
          + destination_type      = "volume"
          + multiattach           = false
          + source_type           = "volume"
          + uuid                  = (known after apply)
        }

      + network {
          + access_network = false
          + fixed_ip_v4    = (known after apply)
          + fixed_ip_v6    = (known after apply)
          + mac            = (known after apply)
          + name           = (known after apply)
          + port           = (known after apply)
          + uuid           = (known after apply)
        }
    }

  # openstack_compute_instance_v2.node_server[4] will be created
  + resource "openstack_compute_instance_v2" "node_server" {
      + access_ip_v4        = (known after apply)
      + access_ip_v6        = (known after apply)
      + all_metadata        = (known after apply)
      + all_tags            = (known after apply)
      + availability_zone   = "nova"
      + config_drive        = true
      + created             = (known after apply)
      + flavor_id           = (known after apply)
      + flavor_name         = "OSISM-8V-32"
      + force_delete        = false
      + hypervisor_hostname = (known after apply)
      + id                  = (known after apply)
      + image_id            = (known after apply)
      + image_name          = (known after apply)
      + key_pair            = "testbed"
      + name                = "testbed-node-4"
      + power_state         = "active"
      + region              = (known after apply)
      + security_groups     = (known after apply)
      + stop_before_destroy = false
      + updated             = (known after apply)
      + user_data           = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854"

      + block_device {
          + boot_index            = 0
          + delete_on_termination = false
          + destination_type      = "volume"
          + multiattach           = false
          + source_type           = "volume"
          + uuid                  = (known after apply)
        }

      + network {
          + access_network = false
          + fixed_ip_v4    = (known after apply)
          + fixed_ip_v6    = (known after apply)
          + mac            = (known after apply)
          + name           = (known after apply)
          + port           = (known after apply)
          + uuid           = (known after apply)
        }
    }

  # openstack_compute_instance_v2.node_server[5] will be created
  + resource "openstack_compute_instance_v2" "node_server" {
      + access_ip_v4        = (known after apply)
      + access_ip_v6        = (known after apply)
      + all_metadata        = (known after apply)
      + all_tags            = (known after apply)
      + availability_zone   = "nova"
      + config_drive        = true
      + created             = (known after apply)
      + flavor_id           = (known after apply)
      + flavor_name         = "OSISM-8V-32"
      + force_delete        = false
      + hypervisor_hostname = (known after apply)
      + id                  = (known after apply)
      + image_id            = (known after apply)
      + image_name          = (known after apply)
      + key_pair            = "testbed"
      + name                = "testbed-node-5"
      + power_state         = "active"
      + region              = (known after apply)
      + security_groups     = (known after apply)
      + stop_before_destroy = false
      + updated             = (known after apply)
      + user_data           = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854"

      + block_device {
          + boot_index            = 0
          + delete_on_termination = false
          + destination_type      = "volume"
          + multiattach           = false
          + source_type           = "volume"
          + uuid                  = (known after apply)
        }

      + network {
          + access_network = false
          + fixed_ip_v4    = (known after apply)
          + fixed_ip_v6    = (known after apply)
          + mac            = (known after apply)
          + name           = (known after apply)
          + port           = (known after apply)
          + uuid           = (known after apply)
        }
    }

  # openstack_compute_keypair_v2.key will be created
  + resource "openstack_compute_keypair_v2" "key" {
      + fingerprint = (known after apply)
      + id          = (known after apply)
      + name        = "testbed"
      + private_key = (sensitive value)
      + public_key  = (known after apply)
      + region      = (known after apply)
      + user_id     = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[0] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[1] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[2] will be created
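The node_server entries in the plan output above all boot from a pre-created volume (boot index 0, volume-to-volume, not deleted on termination). A minimal HCL sketch that would produce plan entries of this shape — only the literal values (flavor, key pair, availability zone, config drive, naming scheme) come from the log; the count and the referenced volume/port resources are assumptions:

```hcl
# Sketch only: reconstructs the likely shape of the node_server resource
# from the plan output. The referenced volume and port resources are
# hypothetical; only the literal attribute values appear in the log.
resource "openstack_compute_instance_v2" "node_server" {
  count             = 6                                        # assumed total; the log shows [3]..[5]
  name              = "testbed-node-${count.index}"
  flavor_name       = "OSISM-8V-32"
  key_pair          = openstack_compute_keypair_v2.key.name    # "testbed"; wiring assumed
  availability_zone = "nova"
  config_drive      = true
  power_state       = "active"

  block_device {
    # Boot from an existing volume and keep it when the instance
    # is destroyed (delete_on_termination = false).
    boot_index            = 0
    source_type           = "volume"
    destination_type      = "volume"
    delete_on_termination = false
    uuid                  = openstack_blockstorage_volume_v3.node_volume[count.index].id  # hypothetical
  }

  network {
    port = openstack_networking_port_v2.node_port_management[count.index].id  # assumed wiring
  }
}
```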
2025-09-23 18:30:09.027-09.037 | orchestrator | STDOUT terraform:
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[3] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[4] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[5] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[6] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[7] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[8] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_networking_floatingip_associate_v2.manager_floating_ip_association will be created
  + resource "openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" {
      + fixed_ip    = (known after apply)
      + floating_ip = (known after apply)
      + id          = (known after apply)
      + port_id     = (known after apply)
      + region      = (known after apply)
    }

  # openstack_networking_floatingip_v2.manager_floating_ip will be created
  + resource "openstack_networking_floatingip_v2" "manager_floating_ip" {
      + address    = (known after apply)
      + all_tags   = (known after apply)
      + dns_domain = (known after apply)
      + dns_name   = (known after apply)
      + fixed_ip   = (known after apply)
      + id         = (known after apply)
      + pool       = "public"
      + port_id    = (known after apply)
      + region     = (known after apply)
      + subnet_id  = (known after apply)
      + tenant_id  = (known after apply)
    }

  # openstack_networking_network_v2.net_management will be created
  + resource "openstack_networking_network_v2" "net_management" {
      + admin_state_up          = (known after apply)
      + all_tags                = (known after apply)
      + availability_zone_hints = [
          + "nova",
        ]
      + dns_domain              = (known after apply)
      + external                = (known after apply)
      + id                      = (known after apply)
      + mtu                     = (known after apply)
      + name                    = "net-testbed-management"
      + port_security_enabled   = (known after apply)
      + qos_policy_id           = (known after apply)
      + region                  = (known after apply)
      + shared                  = (known after apply)
      + tenant_id               = (known after apply)
      + transparent_vlan        = (known after apply)

      + segments (known after apply)
    }

  # openstack_networking_port_v2.manager_port_management will be created
  + resource "openstack_networking_port_v2" "manager_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.5"
2025-09-23 18:30:09.037345 | orchestrator | 18:30:09.037 STDOUT terraform:  + subnet_id = (known after apply) 2025-09-23 18:30:09.037369 | orchestrator | 18:30:09.037 STDOUT terraform:  } 2025-09-23 18:30:09.037390 | orchestrator | 18:30:09.037 STDOUT terraform:  } 2025-09-23 18:30:09.037448 | orchestrator | 18:30:09.037 STDOUT terraform:  # openstack_networking_port_v2.node_port_management[0] will be created 2025-09-23 18:30:09.037499 | orchestrator | 18:30:09.037 STDOUT terraform:  + resource "openstack_networking_port_v2" "node_port_management" { 2025-09-23 18:30:09.037542 | orchestrator | 18:30:09.037 STDOUT terraform:  + admin_state_up = (known after apply) 2025-09-23 18:30:09.037587 | orchestrator | 18:30:09.037 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-09-23 18:30:09.037639 | orchestrator | 18:30:09.037 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-09-23 18:30:09.037682 | orchestrator | 18:30:09.037 STDOUT terraform:  + all_tags = (known after apply) 2025-09-23 18:30:09.037724 | orchestrator | 18:30:09.037 STDOUT terraform:  + device_id = (known after apply) 2025-09-23 18:30:09.037772 | orchestrator | 18:30:09.037 STDOUT terraform:  + device_owner = (known after apply) 2025-09-23 18:30:09.037815 | orchestrator | 18:30:09.037 STDOUT terraform:  + dns_assignment = (known after apply) 2025-09-23 18:30:09.037856 | orchestrator | 18:30:09.037 STDOUT terraform:  + dns_name = (known after apply) 2025-09-23 18:30:09.037898 | orchestrator | 18:30:09.037 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.037944 | orchestrator | 18:30:09.037 STDOUT terraform:  + mac_address = (known after apply) 2025-09-23 18:30:09.037987 | orchestrator | 18:30:09.037 STDOUT terraform:  + network_id = (known after apply) 2025-09-23 18:30:09.038053 | orchestrator | 18:30:09.037 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-09-23 18:30:09.038098 | orchestrator | 18:30:09.038 STDOUT terraform:  + 
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)
      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.254/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/20"
        }
      + binding (known after apply)
      + fixed_ip {
          + ip_address = "192.168.16.10"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[1] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)
      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.254/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/20"
        }
      + binding (known after apply)
      + fixed_ip {
          + ip_address = "192.168.16.11"
          + subnet_id  = (known after apply)
        }
    }
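For orientation, port blocks like the ones above are typically produced by a counted `openstack_networking_port_v2` resource. The following is a minimal sketch, not the testbed repository's actual configuration — the network/subnet references are assumed names, and only the addresses visible in the plan are taken from the log:

```hcl
# Hypothetical sketch of the configuration behind the port plan above.
resource "openstack_networking_port_v2" "node_port_management" {
  count      = 6
  network_id = openstack_networking_network_v2.net_management.id  # assumed reference

  # One fixed management address per node: 192.168.16.10 .. 192.168.16.15
  fixed_ip {
    subnet_id  = openstack_networking_subnet_v2.subnet_management.id  # assumed reference
    ip_address = "192.168.16.${10 + count.index}"
  }

  # Allow additional (virtual) addresses behind the port, e.g. VIPs
  # moved between nodes by keepalived/VRRP.
  dynamic "allowed_address_pairs" {
    for_each = [
      "192.168.112.0/20",
      "192.168.16.254/20",
      "192.168.16.8/20",
      "192.168.16.9/20",
    ]
    content {
      ip_address = allowed_address_pairs.value
    }
  }
}
```

Without the `allowed_address_pairs` entries, Neutron's port security would drop traffic sourced from any address other than the port's own fixed IP.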
  # openstack_networking_port_v2.node_port_management[2] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)
      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.254/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/20"
        }
      + binding (known after apply)
      + fixed_ip {
          + ip_address = "192.168.16.12"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[3] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)
      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.254/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/20"
        }
      + binding (known after apply)
      + fixed_ip {
          + ip_address = "192.168.16.13"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[4] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)
      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.254/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/20"
        }
      + binding (known after apply)
      + fixed_ip {
          + ip_address = "192.168.16.14"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[5] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)
      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.254/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/20"
        }
      + binding (known after apply)
      + fixed_ip {
          + ip_address = "192.168.16.15"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_router_interface_v2.router_interface will be created
  + resource "openstack_networking_router_interface_v2" "router_interface" {
      + force_destroy = false
      + id            = (known after apply)
      + port_id       = (known after apply)
      + region        = (known after apply)
      + router_id     = (known after apply)
      + subnet_id     = (known after apply)
    }

  # openstack_networking_router_v2.router will be created
  + resource "openstack_networking_router_v2" "router" {
      + admin_state_up          = (known after apply)
      + all_tags                = (known after apply)
      + availability_zone_hints = [
          + "nova",
        ]
      + distributed             = (known after apply)
      + enable_snat             = (known after apply)
      + external_network_id     = "e6be7364-bfd8-4de7-8120-8f41c69a139a"
      + external_qos_policy_id  = (known after apply)
      + id                      = (known after apply)
      + name                    = "testbed"
      + region                  = (known after apply)
      + tenant_id               = (known after apply)
      + external_fixed_ip (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_management_rule1 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" {
      + description             = "ssh"
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + port_range_max          = 22
      + port_range_min          = 22
      + protocol                = "tcp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }
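The router and the SSH rule planned above correspond to configuration roughly like the following sketch. Only the values visible in the plan (the external network UUID, the router name, and the rule parameters) come from the log; the security-group reference is an assumed name:

```hcl
# Hypothetical sketch matching the router and ssh-rule plan output above.
resource "openstack_networking_router_v2" "router" {
  name                    = "testbed"
  external_network_id     = "e6be7364-bfd8-4de7-8120-8f41c69a139a"
  availability_zone_hints = ["nova"]
}

resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" {
  description      = "ssh"
  direction        = "ingress"
  ethertype        = "IPv4"
  protocol         = "tcp"
  port_range_min   = 22
  port_range_max   = 22
  remote_ip_prefix = "0.0.0.0/0"
  # Assumed reference to the management security group:
  security_group_id = openstack_networking_secgroup_v2.security_group_management.id
}
```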
  # openstack_networking_secgroup_rule_v2.security_group_management_rule2 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule2" {
      + description             = "wireguard"
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + port_range_max          = 51820
      + port_range_min          = 51820
      + protocol                = "udp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_management_rule3 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule3" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "tcp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "192.168.16.0/20"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_management_rule4 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule4" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "udp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "192.168.16.0/20"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_management_rule5 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule5" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "icmp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_node_rule1 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule1" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "tcp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_node_rule2 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule2" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "udp"
      + region                  = (known after apply)
      + remote_address_group_id = (known after apply)
      + remote_group_id         = (known after apply)
      + remote_ip_prefix        = "0.0.0.0/0"
      + security_group_id       = (known after apply)
      + tenant_id               = (known after apply)
    }

  # openstack_networking_secgroup_rule_v2.security_group_node_rule3 will be created
  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule3" {
      + direction               = "ingress"
      + ethertype               = "IPv4"
      + id                      = (known after apply)
      + protocol                = "icmp"
18:30:09.051417 | orchestrator | 18:30:09.051 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.051458 | orchestrator | 18:30:09.051 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-23 18:30:09.051499 | orchestrator | 18:30:09.051 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-23 18:30:09.051533 | orchestrator | 18:30:09.051 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-09-23 18:30:09.051574 | orchestrator | 18:30:09.051 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-23 18:30:09.051616 | orchestrator | 18:30:09.051 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-23 18:30:09.051655 | orchestrator | 18:30:09.051 STDOUT terraform:  } 2025-09-23 18:30:09.051711 | orchestrator | 18:30:09.051 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_rule_vrrp will be created 2025-09-23 18:30:09.051768 | orchestrator | 18:30:09.051 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" { 2025-09-23 18:30:09.051799 | orchestrator | 18:30:09.051 STDOUT terraform:  + description = "vrrp" 2025-09-23 18:30:09.051836 | orchestrator | 18:30:09.051 STDOUT terraform:  + direction = "ingress" 2025-09-23 18:30:09.051880 | orchestrator | 18:30:09.051 STDOUT terraform:  + ethertype = "IPv4" 2025-09-23 18:30:09.051926 | orchestrator | 18:30:09.051 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.051958 | orchestrator | 18:30:09.051 STDOUT terraform:  + protocol = "112" 2025-09-23 18:30:09.052000 | orchestrator | 18:30:09.051 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.052041 | orchestrator | 18:30:09.052 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-23 18:30:09.052082 | orchestrator | 18:30:09.052 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-23 18:30:09.052118 | orchestrator | 18:30:09.052 STDOUT terraform:  + remote_ip_prefix 
= "0.0.0.0/0" 2025-09-23 18:30:09.052164 | orchestrator | 18:30:09.052 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-23 18:30:09.052206 | orchestrator | 18:30:09.052 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-23 18:30:09.052227 | orchestrator | 18:30:09.052 STDOUT terraform:  } 2025-09-23 18:30:09.052281 | orchestrator | 18:30:09.052 STDOUT terraform:  # openstack_networking_secgroup_v2.security_group_management will be created 2025-09-23 18:30:09.052334 | orchestrator | 18:30:09.052 STDOUT terraform:  + resource "openstack_networking_secgroup_v2" "security_group_management" { 2025-09-23 18:30:09.052369 | orchestrator | 18:30:09.052 STDOUT terraform:  + all_tags = (known after apply) 2025-09-23 18:30:09.052409 | orchestrator | 18:30:09.052 STDOUT terraform:  + description = "management security group" 2025-09-23 18:30:09.052443 | orchestrator | 18:30:09.052 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.052477 | orchestrator | 18:30:09.052 STDOUT terraform:  + name = "testbed-management" 2025-09-23 18:30:09.052510 | orchestrator | 18:30:09.052 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.052546 | orchestrator | 18:30:09.052 STDOUT terraform:  + stateful = (known after apply) 2025-09-23 18:30:09.052580 | orchestrator | 18:30:09.052 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-23 18:30:09.052599 | orchestrator | 18:30:09.052 STDOUT terraform:  } 2025-09-23 18:30:09.052663 | orchestrator | 18:30:09.052 STDOUT terraform:  # openstack_networking_secgroup_v2.security_group_node will be created 2025-09-23 18:30:09.052718 | orchestrator | 18:30:09.052 STDOUT terraform:  + resource "openstack_networking_secgroup_v2" "security_group_node" { 2025-09-23 18:30:09.052754 | orchestrator | 18:30:09.052 STDOUT terraform:  + all_tags = (known after apply) 2025-09-23 18:30:09.052795 | orchestrator | 18:30:09.052 STDOUT terraform:  + description = "node security group" 2025-09-23 
18:30:09.052830 | orchestrator | 18:30:09.052 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.052860 | orchestrator | 18:30:09.052 STDOUT terraform:  + name = "testbed-node" 2025-09-23 18:30:09.052894 | orchestrator | 18:30:09.052 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.052927 | orchestrator | 18:30:09.052 STDOUT terraform:  + stateful = (known after apply) 2025-09-23 18:30:09.052961 | orchestrator | 18:30:09.052 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-23 18:30:09.052981 | orchestrator | 18:30:09.052 STDOUT terraform:  } 2025-09-23 18:30:09.053032 | orchestrator | 18:30:09.052 STDOUT terraform:  # openstack_networking_subnet_v2.subnet_management will be created 2025-09-23 18:30:09.053082 | orchestrator | 18:30:09.053 STDOUT terraform:  + resource "openstack_networking_subnet_v2" "subnet_management" { 2025-09-23 18:30:09.053120 | orchestrator | 18:30:09.053 STDOUT terraform:  + all_tags = (known after apply) 2025-09-23 18:30:09.053155 | orchestrator | 18:30:09.053 STDOUT terraform:  + cidr = "192.168.16.0/20" 2025-09-23 18:30:09.053181 | orchestrator | 18:30:09.053 STDOUT terraform:  + dns_nameservers = [ 2025-09-23 18:30:09.053203 | orchestrator | 18:30:09.053 STDOUT terraform:  + "8.8.8.8", 2025-09-23 18:30:09.053230 | orchestrator | 18:30:09.053 STDOUT terraform:  + "9.9.9.9", 2025-09-23 18:30:09.053250 | orchestrator | 18:30:09.053 STDOUT terraform:  ] 2025-09-23 18:30:09.053277 | orchestrator | 18:30:09.053 STDOUT terraform:  + enable_dhcp = true 2025-09-23 18:30:09.053313 | orchestrator | 18:30:09.053 STDOUT terraform:  + gateway_ip = (known after apply) 2025-09-23 18:30:09.053349 | orchestrator | 18:30:09.053 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.053376 | orchestrator | 18:30:09.053 STDOUT terraform:  + ip_version = 4 2025-09-23 18:30:09.053412 | orchestrator | 18:30:09.053 STDOUT terraform:  + ipv6_address_mode = (known after apply) 2025-09-23 18:30:09.053448 | 
orchestrator | 18:30:09.053 STDOUT terraform:  + ipv6_ra_mode = (known after apply) 2025-09-23 18:30:09.053490 | orchestrator | 18:30:09.053 STDOUT terraform:  + name = "subnet-testbed-management" 2025-09-23 18:30:09.053537 | orchestrator | 18:30:09.053 STDOUT terraform:  + network_id = (known after apply) 2025-09-23 18:30:09.053565 | orchestrator | 18:30:09.053 STDOUT terraform:  + no_gateway = false 2025-09-23 18:30:09.053603 | orchestrator | 18:30:09.053 STDOUT terraform:  + region = (known after apply) 2025-09-23 18:30:09.053646 | orchestrator | 18:30:09.053 STDOUT terraform:  + service_types = (known after apply) 2025-09-23 18:30:09.053683 | orchestrator | 18:30:09.053 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-23 18:30:09.053708 | orchestrator | 18:30:09.053 STDOUT terraform:  + allocation_pool { 2025-09-23 18:30:09.053738 | orchestrator | 18:30:09.053 STDOUT terraform:  + end = "192.168.31.250" 2025-09-23 18:30:09.053768 | orchestrator | 18:30:09.053 STDOUT terraform:  + start = "192.168.31.200" 2025-09-23 18:30:09.053789 | orchestrator | 18:30:09.053 STDOUT terraform:  } 2025-09-23 18:30:09.053808 | orchestrator | 18:30:09.053 STDOUT terraform:  } 2025-09-23 18:30:09.053838 | orchestrator | 18:30:09.053 STDOUT terraform:  # terraform_data.image will be created 2025-09-23 18:30:09.053868 | orchestrator | 18:30:09.053 STDOUT terraform:  + resource "terraform_data" "image" { 2025-09-23 18:30:09.053908 | orchestrator | 18:30:09.053 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.053935 | orchestrator | 18:30:09.053 STDOUT terraform:  + input = "Ubuntu 24.04" 2025-09-23 18:30:09.053966 | orchestrator | 18:30:09.053 STDOUT terraform:  + output = (known after apply) 2025-09-23 18:30:09.053986 | orchestrator | 18:30:09.053 STDOUT terraform:  } 2025-09-23 18:30:09.054038 | orchestrator | 18:30:09.053 STDOUT terraform:  # terraform_data.image_node will be created 2025-09-23 18:30:09.054077 | orchestrator | 18:30:09.054 STDOUT 
terraform:  + resource "terraform_data" "image_node" { 2025-09-23 18:30:09.054108 | orchestrator | 18:30:09.054 STDOUT terraform:  + id = (known after apply) 2025-09-23 18:30:09.054135 | orchestrator | 18:30:09.054 STDOUT terraform:  + input = "Ubuntu 24.04" 2025-09-23 18:30:09.054164 | orchestrator | 18:30:09.054 STDOUT terraform:  + output = (known after apply) 2025-09-23 18:30:09.054183 | orchestrator | 18:30:09.054 STDOUT terraform:  } 2025-09-23 18:30:09.054219 | orchestrator | 18:30:09.054 STDOUT terraform: Plan: 64 to add, 0 to change, 0 to destroy. 2025-09-23 18:30:09.054244 | orchestrator | 18:30:09.054 STDOUT terraform: Changes to Outputs: 2025-09-23 18:30:09.054274 | orchestrator | 18:30:09.054 STDOUT terraform:  + manager_address = (sensitive value) 2025-09-23 18:30:09.054305 | orchestrator | 18:30:09.054 STDOUT terraform:  + private_key = (sensitive value) 2025-09-23 18:30:09.123292 | orchestrator | 18:30:09.122 STDOUT terraform: terraform_data.image_node: Creating... 2025-09-23 18:30:09.123327 | orchestrator | 18:30:09.122 STDOUT terraform: terraform_data.image_node: Creation complete after 0s [id=f3271978-d572-3734-31b5-ef221551422a] 2025-09-23 18:30:09.204325 | orchestrator | 18:30:09.204 STDOUT terraform: terraform_data.image: Creating... 2025-09-23 18:30:09.205530 | orchestrator | 18:30:09.205 STDOUT terraform: terraform_data.image: Creation complete after 0s [id=7d11403a-8cd8-bfbb-2836-1a373c50e167] 2025-09-23 18:30:09.228030 | orchestrator | 18:30:09.227 STDOUT terraform: data.openstack_images_image_v2.image_node: Reading... 2025-09-23 18:30:09.228068 | orchestrator | 18:30:09.227 STDOUT terraform: data.openstack_images_image_v2.image: Reading... 2025-09-23 18:30:09.236421 | orchestrator | 18:30:09.233 STDOUT terraform: openstack_compute_keypair_v2.key: Creating... 2025-09-23 18:30:09.237026 | orchestrator | 18:30:09.236 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Creating... 
2025-09-23 18:30:09.237308 | orchestrator | 18:30:09.237 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Creating... 2025-09-23 18:30:09.238134 | orchestrator | 18:30:09.237 STDOUT terraform: openstack_networking_network_v2.net_management: Creating... 2025-09-23 18:30:09.239855 | orchestrator | 18:30:09.239 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Creating... 2025-09-23 18:30:09.244382 | orchestrator | 18:30:09.244 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Creating... 2025-09-23 18:30:09.244565 | orchestrator | 18:30:09.244 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Creating... 2025-09-23 18:30:09.244862 | orchestrator | 18:30:09.244 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Creating... 2025-09-23 18:30:09.931899 | orchestrator | 18:30:09.931 STDOUT terraform: data.openstack_images_image_v2.image_node: Read complete after 1s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2025-09-23 18:30:09.938058 | orchestrator | 18:30:09.936 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Creating... 2025-09-23 18:30:09.983380 | orchestrator | 18:30:09.983 STDOUT terraform: data.openstack_images_image_v2.image: Read complete after 1s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2025-09-23 18:30:09.987987 | orchestrator | 18:30:09.987 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Creating... 2025-09-23 18:30:10.067967 | orchestrator | 18:30:10.067 STDOUT terraform: openstack_compute_keypair_v2.key: Creation complete after 1s [id=testbed] 2025-09-23 18:30:10.073184 | orchestrator | 18:30:10.073 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Creating... 
2025-09-23 18:30:10.660072 | orchestrator | 18:30:10.659 STDOUT terraform: openstack_networking_network_v2.net_management: Creation complete after 2s [id=4708ea29-d0a0-4899-832d-d7eca38ad41e] 2025-09-23 18:30:10.670784 | orchestrator | 18:30:10.668 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Creating... 2025-09-23 18:30:13.176163 | orchestrator | 18:30:13.173 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Creation complete after 4s [id=2f832cfd-0250-47f3-a635-d697408042bd] 2025-09-23 18:30:13.176229 | orchestrator | 18:30:13.174 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Creation complete after 4s [id=d82469de-3742-489b-9a9c-b38cbdf5a8bd] 2025-09-23 18:30:13.189950 | orchestrator | 18:30:13.189 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Creating... 2025-09-23 18:30:13.190493 | orchestrator | 18:30:13.190 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Creating... 2025-09-23 18:30:13.224500 | orchestrator | 18:30:13.224 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Creation complete after 4s [id=e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8] 2025-09-23 18:30:13.236069 | orchestrator | 18:30:13.235 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Creating... 2025-09-23 18:30:13.255084 | orchestrator | 18:30:13.254 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Creation complete after 4s [id=c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5] 2025-09-23 18:30:13.265531 | orchestrator | 18:30:13.265 STDOUT terraform: local_sensitive_file.id_rsa: Creating... 2025-09-23 18:30:13.269888 | orchestrator | 18:30:13.269 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Creation complete after 4s [id=ad3d32bb-3e57-4330-95b4-3d115fcffc85] 2025-09-23 18:30:13.273470 | orchestrator | 18:30:13.273 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Creating... 
2025-09-23 18:30:13.273956 | orchestrator | 18:30:13.273 STDOUT terraform: local_sensitive_file.id_rsa: Creation complete after 0s [id=b9d9212c9aa351c9c70305726ac6f2aaf845be51] 2025-09-23 18:30:13.283436 | orchestrator | 18:30:13.283 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Creating... 2025-09-23 18:30:13.290224 | orchestrator | 18:30:13.290 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Creation complete after 4s [id=fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e] 2025-09-23 18:30:13.301507 | orchestrator | 18:30:13.301 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Creating... 2025-09-23 18:30:13.317915 | orchestrator | 18:30:13.317 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Creation complete after 3s [id=8202d0db-f0b8-43bb-b5ae-a89817ca1052] 2025-09-23 18:30:13.324670 | orchestrator | 18:30:13.324 STDOUT terraform: local_file.id_rsa_pub: Creating... 2025-09-23 18:30:13.327452 | orchestrator | 18:30:13.327 STDOUT terraform: local_file.id_rsa_pub: Creation complete after 0s [id=2bb3f4c1bd6be6fddddc2d4327bff75bef675d4a] 2025-09-23 18:30:13.332364 | orchestrator | 18:30:13.332 STDOUT terraform: openstack_networking_subnet_v2.subnet_management: Creating... 
2025-09-23 18:30:13.355163 | orchestrator | 18:30:13.355 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Creation complete after 3s [id=8164be3f-bf64-45a9-9145-7091701f0cb6] 2025-09-23 18:30:13.583251 | orchestrator | 18:30:13.582 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Creation complete after 4s [id=d7d70b4c-e10d-4821-8a70-30b75615b27b] 2025-09-23 18:30:14.080040 | orchestrator | 18:30:14.079 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Creation complete after 3s [id=40586798-a938-4a0a-ac1b-5e3307fb08ff] 2025-09-23 18:30:14.487651 | orchestrator | 18:30:14.487 STDOUT terraform: openstack_networking_subnet_v2.subnet_management: Creation complete after 1s [id=58c1d78e-cafc-48bf-b2f4-54327ce1c75b] 2025-09-23 18:30:14.495818 | orchestrator | 18:30:14.495 STDOUT terraform: openstack_networking_router_v2.router: Creating... 2025-09-23 18:30:16.666863 | orchestrator | 18:30:16.666 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Creation complete after 4s [id=3a8bf4eb-6835-436a-8a3d-3e86e0ef5705] 2025-09-23 18:30:16.693887 | orchestrator | 18:30:16.693 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Creation complete after 4s [id=09b4bbbe-f80b-4184-957d-358c53e5aa05] 2025-09-23 18:30:16.751507 | orchestrator | 18:30:16.751 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Creation complete after 4s [id=48d13180-cb46-42fb-bb48-4118091051be] 2025-09-23 18:30:16.767734 | orchestrator | 18:30:16.767 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Creation complete after 4s [id=3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f] 2025-09-23 18:30:16.782699 | orchestrator | 18:30:16.782 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Creation complete after 4s [id=0e48d10f-7bad-48f6-8de6-4bf624069e37] 2025-09-23 18:30:16.880138 | orchestrator | 18:30:16.879 STDOUT terraform: 
openstack_blockstorage_volume_v3.node_base_volume[0]: Creation complete after 4s [id=111e41fd-1cdd-43db-a49a-f2bb4cafdaf0] 2025-09-23 18:30:17.471442 | orchestrator | 18:30:17.471 STDOUT terraform: openstack_networking_router_v2.router: Creation complete after 3s [id=0bf83688-4e85-4219-b0aa-f33f13cfb30d] 2025-09-23 18:30:17.483254 | orchestrator | 18:30:17.483 STDOUT terraform: openstack_networking_secgroup_v2.security_group_management: Creating... 2025-09-23 18:30:17.484096 | orchestrator | 18:30:17.483 STDOUT terraform: openstack_networking_router_interface_v2.router_interface: Creating... 2025-09-23 18:30:17.484662 | orchestrator | 18:30:17.484 STDOUT terraform: openstack_networking_secgroup_v2.security_group_node: Creating... 2025-09-23 18:30:17.705908 | orchestrator | 18:30:17.705 STDOUT terraform: openstack_networking_secgroup_v2.security_group_node: Creation complete after 1s [id=2c8e049b-9c60-4c95-bdad-2de26e3975e3] 2025-09-23 18:30:17.724843 | orchestrator | 18:30:17.724 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creating... 2025-09-23 18:30:17.725747 | orchestrator | 18:30:17.725 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creating... 2025-09-23 18:30:17.726884 | orchestrator | 18:30:17.726 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creating... 2025-09-23 18:30:17.733973 | orchestrator | 18:30:17.733 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creating... 2025-09-23 18:30:17.734219 | orchestrator | 18:30:17.734 STDOUT terraform: openstack_networking_secgroup_v2.security_group_management: Creation complete after 1s [id=10810df1-3622-4986-88b7-3fdb8b66a235] 2025-09-23 18:30:17.734485 | orchestrator | 18:30:17.734 STDOUT terraform: openstack_networking_port_v2.node_port_management[0]: Creating... 
2025-09-23 18:30:17.737429 | orchestrator | 18:30:17.737 STDOUT terraform: openstack_networking_port_v2.node_port_management[3]: Creating... 2025-09-23 18:30:17.741599 | orchestrator | 18:30:17.741 STDOUT terraform: openstack_networking_port_v2.node_port_management[5]: Creating... 2025-09-23 18:30:17.744079 | orchestrator | 18:30:17.743 STDOUT terraform: openstack_networking_port_v2.node_port_management[4]: Creating... 2025-09-23 18:30:17.747533 | orchestrator | 18:30:17.747 STDOUT terraform: openstack_networking_port_v2.node_port_management[1]: Creating... 2025-09-23 18:30:17.934315 | orchestrator | 18:30:17.933 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creation complete after 0s [id=11d587ad-f179-4172-a719-079554d17ee4] 2025-09-23 18:30:17.946216 | orchestrator | 18:30:17.945 STDOUT terraform: openstack_networking_port_v2.node_port_management[2]: Creating... 2025-09-23 18:30:18.124588 | orchestrator | 18:30:18.124 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creation complete after 0s [id=8b4f1d52-4946-4129-a5c3-1a592cd2046a] 2025-09-23 18:30:18.132201 | orchestrator | 18:30:18.131 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creating... 2025-09-23 18:30:18.387121 | orchestrator | 18:30:18.386 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creation complete after 0s [id=82004ecb-d81e-494d-9169-50148ec267ba] 2025-09-23 18:30:18.395273 | orchestrator | 18:30:18.394 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creating... 
2025-09-23 18:30:18.552464 | orchestrator | 18:30:18.552 STDOUT terraform: openstack_networking_port_v2.node_port_management[5]: Creation complete after 1s [id=c9051d6a-8cd2-4ea4-aea7-06d311e811bb] 2025-09-23 18:30:18.556806 | orchestrator | 18:30:18.556 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creating... 2025-09-23 18:30:18.619993 | orchestrator | 18:30:18.619 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creation complete after 1s [id=3064ce27-1aff-473a-af7d-cbd85a7bdc6b] 2025-09-23 18:30:18.625791 | orchestrator | 18:30:18.625 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creating... 2025-09-23 18:30:18.841381 | orchestrator | 18:30:18.840 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creation complete after 1s [id=54a17f4f-97fe-41b7-80a3-c83e5269d8a2] 2025-09-23 18:30:18.852946 | orchestrator | 18:30:18.852 STDOUT terraform: openstack_networking_port_v2.manager_port_management: Creating... 2025-09-23 18:30:18.928350 | orchestrator | 18:30:18.928 STDOUT terraform: openstack_networking_port_v2.node_port_management[4]: Creation complete after 1s [id=30778571-0825-4754-900a-c66befc54776] 2025-09-23 18:30:18.934533 | orchestrator | 18:30:18.934 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creating... 
2025-09-23 18:30:18.935223 | orchestrator | 18:30:18.935 STDOUT terraform: openstack_networking_port_v2.node_port_management[2]: Creation complete after 1s [id=f513b74f-4aec-4175-a098-baba8ff8fa96] 2025-09-23 18:30:18.939714 | orchestrator | 18:30:18.939 STDOUT terraform: openstack_networking_port_v2.node_port_management[3]: Creation complete after 1s [id=2a72add7-a386-4673-a114-4869f46d87df] 2025-09-23 18:30:18.996241 | orchestrator | 18:30:18.995 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creation complete after 0s [id=a32929d3-4927-4e4f-93ca-6924e6426aaa] 2025-09-23 18:30:19.222093 | orchestrator | 18:30:19.221 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creation complete after 0s [id=148eb212-e3bb-4641-977e-6b4caa05636d] 2025-09-23 18:30:19.310159 | orchestrator | 18:30:19.309 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creation complete after 1s [id=f2734fe7-c443-46a8-9ec5-85b774b8be3c] 2025-09-23 18:30:19.409388 | orchestrator | 18:30:19.409 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creation complete after 0s [id=4fde664b-3bf6-4eaa-a482-b96ea73a042d] 2025-09-23 18:30:19.421449 | orchestrator | 18:30:19.421 STDOUT terraform: openstack_networking_port_v2.node_port_management[0]: Creation complete after 1s [id=0f3b4af0-1afb-4a39-9b22-2c6d6c2c2a07] 2025-09-23 18:30:19.451435 | orchestrator | 18:30:19.451 STDOUT terraform: openstack_networking_port_v2.manager_port_management: Creation complete after 0s [id=4347530c-3b15-41ea-8190-9eeef2f833e1] 2025-09-23 18:30:19.504412 | orchestrator | 18:30:19.504 STDOUT terraform: openstack_networking_port_v2.node_port_management[1]: Creation complete after 2s [id=0a508350-954e-4034-80fb-127600f3af04] 2025-09-23 18:30:21.627683 | orchestrator | 18:30:21.627 STDOUT terraform: openstack_networking_router_interface_v2.router_interface: Creation complete 
after 5s [id=bf2a26e6-dce5-49e7-9e2e-413a51d61d35] 2025-09-23 18:30:21.641112 | orchestrator | 18:30:21.640 STDOUT terraform: openstack_networking_floatingip_v2.manager_floating_ip: Creating... 2025-09-23 18:30:21.656739 | orchestrator | 18:30:21.656 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Creating... 2025-09-23 18:30:21.658081 | orchestrator | 18:30:21.657 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Creating... 2025-09-23 18:30:21.675425 | orchestrator | 18:30:21.674 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Creating... 2025-09-23 18:30:21.675488 | orchestrator | 18:30:21.674 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Creating... 2025-09-23 18:30:21.678029 | orchestrator | 18:30:21.677 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Creating... 2025-09-23 18:30:21.679015 | orchestrator | 18:30:21.678 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Creating... 2025-09-23 18:30:23.895965 | orchestrator | 18:30:23.895 STDOUT terraform: openstack_networking_floatingip_v2.manager_floating_ip: Creation complete after 2s [id=719a8355-9cd4-4f2c-b2e9-3d5acab33664] 2025-09-23 18:30:23.902395 | orchestrator | 18:30:23.901 STDOUT terraform: openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creating... 2025-09-23 18:30:23.912962 | orchestrator | 18:30:23.912 STDOUT terraform: local_file.MANAGER_ADDRESS: Creating... 2025-09-23 18:30:23.913369 | orchestrator | 18:30:23.913 STDOUT terraform: local_file.inventory: Creating... 
2025-09-23 18:30:23.921261 | orchestrator | 18:30:23.921 STDOUT terraform: local_file.MANAGER_ADDRESS: Creation complete after 0s [id=45839e49d507f5536ad5565d4d71e5ef10ecf5bd] 2025-09-23 18:30:23.921686 | orchestrator | 18:30:23.921 STDOUT terraform: local_file.inventory: Creation complete after 0s [id=d984f9226134af21f1c397fca19cda43017836e7] 2025-09-23 18:30:24.790123 | orchestrator | 18:30:24.789 STDOUT terraform: openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creation complete after 1s [id=719a8355-9cd4-4f2c-b2e9-3d5acab33664] 2025-09-23 18:30:31.663499 | orchestrator | 18:30:31.663 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [10s elapsed] 2025-09-23 18:30:31.663596 | orchestrator | 18:30:31.663 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [10s elapsed] 2025-09-23 18:30:31.677651 | orchestrator | 18:30:31.677 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [10s elapsed] 2025-09-23 18:30:31.677798 | orchestrator | 18:30:31.677 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... [10s elapsed] 2025-09-23 18:30:31.680832 | orchestrator | 18:30:31.680 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [10s elapsed] 2025-09-23 18:30:31.683207 | orchestrator | 18:30:31.682 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [10s elapsed] 2025-09-23 18:30:41.664152 | orchestrator | 18:30:41.663 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [20s elapsed] 2025-09-23 18:30:41.664298 | orchestrator | 18:30:41.664 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [20s elapsed] 2025-09-23 18:30:41.678341 | orchestrator | 18:30:41.678 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... 
[20s elapsed] 2025-09-23 18:30:41.678474 | orchestrator | 18:30:41.678 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [20s elapsed] 2025-09-23 18:30:41.681609 | orchestrator | 18:30:41.681 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [20s elapsed] 2025-09-23 18:30:41.684036 | orchestrator | 18:30:41.683 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [20s elapsed] 2025-09-23 18:30:42.299474 | orchestrator | 18:30:42.299 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Creation complete after 20s [id=311cb270-5c95-41a2-9b6a-4c57fcee8bf0] 2025-09-23 18:30:51.664594 | orchestrator | 18:30:51.664 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [30s elapsed] 2025-09-23 18:30:51.678654 | orchestrator | 18:30:51.678 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... [30s elapsed] 2025-09-23 18:30:51.678825 | orchestrator | 18:30:51.678 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [30s elapsed] 2025-09-23 18:30:51.681958 | orchestrator | 18:30:51.681 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [30s elapsed] 2025-09-23 18:30:51.684259 | orchestrator | 18:30:51.684 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... 
[30s elapsed] 2025-09-23 18:30:52.451360 | orchestrator | 18:30:52.450 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Creation complete after 30s [id=8e8699dd-8b9c-46dd-89af-491ba7373a53] 2025-09-23 18:30:52.507727 | orchestrator | 18:30:52.507 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Creation complete after 31s [id=605323a2-2ea6-4f28-a55e-7426175c786e] 2025-09-23 18:30:52.520207 | orchestrator | 18:30:52.519 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Creation complete after 31s [id=e00fd445-10a2-4a06-a029-8d1e963b5938] 2025-09-23 18:30:52.593641 | orchestrator | 18:30:52.593 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Creation complete after 31s [id=3e610cf3-2629-4783-8598-a793dd120743] 2025-09-23 18:30:52.625706 | orchestrator | 18:30:52.625 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Creation complete after 31s [id=cca01c99-3a12-4026-b0bf-4fe236925379] 2025-09-23 18:30:52.646477 | orchestrator | 18:30:52.646 STDOUT terraform: null_resource.node_semaphore: Creating... 2025-09-23 18:30:52.653868 | orchestrator | 18:30:52.653 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creating... 2025-09-23 18:30:52.655281 | orchestrator | 18:30:52.655 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creating... 2025-09-23 18:30:52.663130 | orchestrator | 18:30:52.662 STDOUT terraform: null_resource.node_semaphore: Creation complete after 0s [id=5079845576596949084] 2025-09-23 18:30:52.667105 | orchestrator | 18:30:52.666 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creating... 2025-09-23 18:30:52.667213 | orchestrator | 18:30:52.667 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creating... 2025-09-23 18:30:52.667334 | orchestrator | 18:30:52.667 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creating... 
2025-09-23 18:30:52.686131 | orchestrator | 18:30:52.685 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creating... 2025-09-23 18:30:52.687230 | orchestrator | 18:30:52.687 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creating... 2025-09-23 18:30:52.687259 | orchestrator | 18:30:52.687 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creating... 2025-09-23 18:30:52.688855 | orchestrator | 18:30:52.688 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creating... 2025-09-23 18:30:52.702526 | orchestrator | 18:30:52.702 STDOUT terraform: openstack_compute_instance_v2.manager_server: Creating... 2025-09-23 18:30:56.088436 | orchestrator | 18:30:56.088 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creation complete after 3s [id=8e8699dd-8b9c-46dd-89af-491ba7373a53/8164be3f-bf64-45a9-9145-7091701f0cb6] 2025-09-23 18:30:56.116185 | orchestrator | 18:30:56.115 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creation complete after 3s [id=311cb270-5c95-41a2-9b6a-4c57fcee8bf0/e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8] 2025-09-23 18:30:56.129511 | orchestrator | 18:30:56.129 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creation complete after 3s [id=cca01c99-3a12-4026-b0bf-4fe236925379/fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e] 2025-09-23 18:30:56.160259 | orchestrator | 18:30:56.159 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creation complete after 3s [id=8e8699dd-8b9c-46dd-89af-491ba7373a53/d82469de-3742-489b-9a9c-b38cbdf5a8bd] 2025-09-23 18:30:56.192190 | orchestrator | 18:30:56.191 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creation complete after 3s [id=311cb270-5c95-41a2-9b6a-4c57fcee8bf0/2f832cfd-0250-47f3-a635-d697408042bd] 2025-09-23 18:31:02.219458 | orchestrator | 
18:31:02.218 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creation complete after 9s [id=cca01c99-3a12-4026-b0bf-4fe236925379/8202d0db-f0b8-43bb-b5ae-a89817ca1052] 2025-09-23 18:31:02.275996 | orchestrator | 18:31:02.275 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creation complete after 9s [id=8e8699dd-8b9c-46dd-89af-491ba7373a53/c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5] 2025-09-23 18:31:02.325280 | orchestrator | 18:31:02.324 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creation complete after 9s [id=cca01c99-3a12-4026-b0bf-4fe236925379/d7d70b4c-e10d-4821-8a70-30b75615b27b] 2025-09-23 18:31:02.363937 | orchestrator | 18:31:02.363 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creation complete after 9s [id=311cb270-5c95-41a2-9b6a-4c57fcee8bf0/ad3d32bb-3e57-4330-95b4-3d115fcffc85] 2025-09-23 18:31:02.711073 | orchestrator | 18:31:02.710 STDOUT terraform: openstack_compute_instance_v2.manager_server: Still creating... [10s elapsed] 2025-09-23 18:31:12.712181 | orchestrator | 18:31:12.711 STDOUT terraform: openstack_compute_instance_v2.manager_server: Still creating... [20s elapsed] 2025-09-23 18:31:13.374243 | orchestrator | 18:31:13.373 STDOUT terraform: openstack_compute_instance_v2.manager_server: Creation complete after 20s [id=79e9794d-2a66-4974-ba5a-f2b25e7fac5b] 2025-09-23 18:31:13.387346 | orchestrator | 18:31:13.387 STDOUT terraform: Apply complete! Resources: 64 added, 0 changed, 0 destroyed. 
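The apply summary above ("Resources: 64 added, 0 changed, 0 destroyed") can be checked mechanically when post-processing a log like this one; a minimal sketch, where `count_added` is a hypothetical helper and not anything terraform itself provides:

```shell
# Hypothetical helper: extract the "added" resource count from a
# terraform apply summary line, e.g. to assert on it in CI afterwards.
count_added() {
    echo "$1" | sed -n 's/.*Resources: \([0-9][0-9]*\) added.*/\1/p'
}

count_added 'Apply complete! Resources: 64 added, 0 changed, 0 destroyed.'
# prints "64"
```

On a line without a summary, `sed -n … p` prints nothing, so an empty result distinguishes "no summary found" from a zero count.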
2025-09-23 18:31:13.387416 | orchestrator | 18:31:13.387 STDOUT terraform: Outputs: 2025-09-23 18:31:13.387439 | orchestrator | 18:31:13.387 STDOUT terraform: manager_address = 2025-09-23 18:31:13.387446 | orchestrator | 18:31:13.387 STDOUT terraform: private_key = 2025-09-23 18:31:13.464214 | orchestrator | ok: Runtime: 0:01:10.597178 2025-09-23 18:31:13.489943 | 2025-09-23 18:31:13.490068 | TASK [Create infrastructure (stable)] 2025-09-23 18:31:14.024403 | orchestrator | skipping: Conditional result was False 2025-09-23 18:31:14.041100 | 2025-09-23 18:31:14.041254 | TASK [Fetch manager address] 2025-09-23 18:31:14.463107 | orchestrator | ok 2025-09-23 18:31:14.472257 | 2025-09-23 18:31:14.472376 | TASK [Set manager_host address] 2025-09-23 18:31:14.552633 | orchestrator | ok 2025-09-23 18:31:14.562190 | 2025-09-23 18:31:14.562316 | LOOP [Update ansible collections] 2025-09-23 18:31:15.401211 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 2025-09-23 18:31:15.401725 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-09-23 18:31:15.401805 | orchestrator | Starting galaxy collection install process 2025-09-23 18:31:15.401849 | orchestrator | Process install dependency map 2025-09-23 18:31:15.401965 | orchestrator | Starting collection install process 2025-09-23 18:31:15.402012 | orchestrator | Installing 'osism.commons:999.0.0' to '/home/zuul-testbed02/.ansible/collections/ansible_collections/osism/commons' 2025-09-23 18:31:15.402053 | orchestrator | Created collection for osism.commons:999.0.0 at /home/zuul-testbed02/.ansible/collections/ansible_collections/osism/commons 2025-09-23 18:31:15.402093 | orchestrator | osism.commons:999.0.0 was installed successfully 2025-09-23 18:31:15.402167 | orchestrator | ok: Item: commons Runtime: 0:00:00.534153 2025-09-23 18:31:16.287205 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 
2025-09-23 18:31:16.287433 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-09-23 18:31:16.287504 | orchestrator | Starting galaxy collection install process 2025-09-23 18:31:16.287558 | orchestrator | Process install dependency map 2025-09-23 18:31:16.287606 | orchestrator | Starting collection install process 2025-09-23 18:31:16.287652 | orchestrator | Installing 'osism.services:999.0.0' to '/home/zuul-testbed02/.ansible/collections/ansible_collections/osism/services' 2025-09-23 18:31:16.287698 | orchestrator | Created collection for osism.services:999.0.0 at /home/zuul-testbed02/.ansible/collections/ansible_collections/osism/services 2025-09-23 18:31:16.287743 | orchestrator | osism.services:999.0.0 was installed successfully 2025-09-23 18:31:16.287807 | orchestrator | ok: Item: services Runtime: 0:00:00.628164 2025-09-23 18:31:16.307751 | 2025-09-23 18:31:16.307921 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2025-09-23 18:31:26.846963 | orchestrator | ok 2025-09-23 18:31:26.862029 | 2025-09-23 18:31:26.862158 | TASK [Wait a little longer for the manager so that everything is ready] 2025-09-23 18:32:26.908372 | orchestrator | ok 2025-09-23 18:32:26.916418 | 2025-09-23 18:32:26.916530 | TASK [Fetch manager ssh hostkey] 2025-09-23 18:32:28.494029 | orchestrator | Output suppressed because no_log was given 2025-09-23 18:32:28.504451 | 2025-09-23 18:32:28.504603 | TASK [Get ssh keypair from terraform environment] 2025-09-23 18:32:29.040312 | orchestrator | ok: Runtime: 0:00:00.009774 2025-09-23 18:32:29.050354 | 2025-09-23 18:32:29.050548 | TASK [Point out that the following task takes some time and does not give any output] 2025-09-23 18:32:29.100309 | orchestrator | ok: The task 'Run manager part 0' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 
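The task above that waits up to 300 seconds for port 22 to become open and contain "OpenSSH" can be approximated outside Ansible; a minimal sketch assuming `nc` is installed (both `banner_is_openssh` and `wait_for_openssh` are illustrative helpers, not part of the testbed tooling):

```shell
# Check whether an SSH identification banner names an OpenSSH server.
banner_is_openssh() {
    printf '%s' "$1" | grep -q 'OpenSSH'
}

# Poll host:22 every 5 seconds until the banner matches or the
# timeout (default 300s, like the task above) expires.
wait_for_openssh() {
    host="$1" timeout="${2:-300}" waited=0
    until banner_is_openssh "$(nc -w 3 "$host" 22 </dev/null 2>/dev/null)"; do
        waited=$((waited + 5))
        [ "$waited" -ge "$timeout" ] && return 1
        sleep 5
    done
}
```

Matching on the banner rather than just the open port avoids declaring the host ready while sshd is still starting up.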
2025-09-23 18:32:29.111472 | 2025-09-23 18:32:29.111667 | TASK [Run manager part 0] 2025-09-23 18:32:29.942072 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-09-23 18:32:29.985373 | orchestrator | 2025-09-23 18:32:29.985416 | orchestrator | PLAY [Wait for cloud-init to finish] ******************************************* 2025-09-23 18:32:29.985423 | orchestrator | 2025-09-23 18:32:29.985435 | orchestrator | TASK [Check /var/lib/cloud/instance/boot-finished] ***************************** 2025-09-23 18:32:31.681742 | orchestrator | ok: [testbed-manager] 2025-09-23 18:32:31.681780 | orchestrator | 2025-09-23 18:32:31.681799 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2025-09-23 18:32:31.681809 | orchestrator | 2025-09-23 18:32:31.681817 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-23 18:32:33.486217 | orchestrator | ok: [testbed-manager] 2025-09-23 18:32:33.486291 | orchestrator | 2025-09-23 18:32:33.486299 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2025-09-23 18:32:34.123466 | orchestrator | ok: [testbed-manager] 2025-09-23 18:32:34.123514 | orchestrator | 2025-09-23 18:32:34.123522 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2025-09-23 18:32:34.162404 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:32:34.162449 | orchestrator | 2025-09-23 18:32:34.162458 | orchestrator | TASK [Update package cache] **************************************************** 2025-09-23 18:32:34.187769 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:32:34.187816 | orchestrator | 2025-09-23 18:32:34.187823 | orchestrator | TASK [Install required packages] *********************************************** 2025-09-23 18:32:34.213069 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:32:34.213115 | 
orchestrator | 2025-09-23 18:32:34.213121 | orchestrator | TASK [Remove some python packages] ********************************************* 2025-09-23 18:32:34.239652 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:32:34.239694 | orchestrator | 2025-09-23 18:32:34.239700 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2025-09-23 18:32:34.264805 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:32:34.264849 | orchestrator | 2025-09-23 18:32:34.264856 | orchestrator | TASK [Fail if Ubuntu version is lower than 22.04] ****************************** 2025-09-23 18:32:34.297054 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:32:34.297099 | orchestrator | 2025-09-23 18:32:34.297107 | orchestrator | TASK [Fail if Debian version is lower than 12] ********************************* 2025-09-23 18:32:34.328327 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:32:34.328372 | orchestrator | 2025-09-23 18:32:34.328378 | orchestrator | TASK [Set APT options on manager] ********************************************** 2025-09-23 18:32:35.051558 | orchestrator | changed: [testbed-manager] 2025-09-23 18:32:35.051607 | orchestrator | 2025-09-23 18:32:35.051615 | orchestrator | TASK [Update APT cache and run dist-upgrade] *********************************** 2025-09-23 18:35:02.821631 | orchestrator | changed: [testbed-manager] 2025-09-23 18:35:02.821709 | orchestrator | 2025-09-23 18:35:02.821725 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************ 2025-09-23 18:36:33.371507 | orchestrator | changed: [testbed-manager] 2025-09-23 18:36:33.371602 | orchestrator | 2025-09-23 18:36:33.371620 | orchestrator | TASK [Install required packages] *********************************************** 2025-09-23 18:36:53.825203 | orchestrator | changed: [testbed-manager] 2025-09-23 18:36:53.825303 | orchestrator | 2025-09-23 18:36:53.825321 | orchestrator | TASK [Remove 
some python packages] ********************************************* 2025-09-23 18:37:02.359918 | orchestrator | changed: [testbed-manager] 2025-09-23 18:37:02.360643 | orchestrator | 2025-09-23 18:37:02.360664 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2025-09-23 18:37:02.405299 | orchestrator | ok: [testbed-manager] 2025-09-23 18:37:02.405371 | orchestrator | 2025-09-23 18:37:02.405385 | orchestrator | TASK [Get current user] ******************************************************** 2025-09-23 18:37:03.173513 | orchestrator | ok: [testbed-manager] 2025-09-23 18:37:03.173603 | orchestrator | 2025-09-23 18:37:03.173621 | orchestrator | TASK [Create venv directory] *************************************************** 2025-09-23 18:37:03.912598 | orchestrator | changed: [testbed-manager] 2025-09-23 18:37:03.912674 | orchestrator | 2025-09-23 18:37:03.912689 | orchestrator | TASK [Install netaddr in venv] ************************************************* 2025-09-23 18:37:10.246462 | orchestrator | changed: [testbed-manager] 2025-09-23 18:37:10.246557 | orchestrator | 2025-09-23 18:37:10.246608 | orchestrator | TASK [Install ansible-core in venv] ******************************************** 2025-09-23 18:37:15.994234 | orchestrator | changed: [testbed-manager] 2025-09-23 18:37:15.994327 | orchestrator | 2025-09-23 18:37:15.994346 | orchestrator | TASK [Install requests >= 2.32.2] ********************************************** 2025-09-23 18:37:18.579917 | orchestrator | changed: [testbed-manager] 2025-09-23 18:37:18.579962 | orchestrator | 2025-09-23 18:37:18.579973 | orchestrator | TASK [Install docker >= 7.1.0] ************************************************* 2025-09-23 18:37:20.360127 | orchestrator | changed: [testbed-manager] 2025-09-23 18:37:20.360217 | orchestrator | 2025-09-23 18:37:20.360234 | orchestrator | TASK [Create directories in /opt/src] ****************************************** 2025-09-23 
18:37:21.456485 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2025-09-23 18:37:21.456581 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2025-09-23 18:37:21.456597 | orchestrator | 2025-09-23 18:37:21.456610 | orchestrator | TASK [Sync sources in /opt/src] ************************************************ 2025-09-23 18:37:21.499680 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2025-09-23 18:37:21.499774 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2025-09-23 18:37:21.499799 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2025-09-23 18:37:21.499820 | orchestrator | deprecation_warnings=False in ansible.cfg. 2025-09-23 18:37:24.674060 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2025-09-23 18:37:24.674108 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2025-09-23 18:37:24.674115 | orchestrator | 2025-09-23 18:37:24.674121 | orchestrator | TASK [Create /usr/share/ansible directory] ************************************* 2025-09-23 18:37:25.166413 | orchestrator | changed: [testbed-manager] 2025-09-23 18:37:25.166501 | orchestrator | 2025-09-23 18:37:25.166515 | orchestrator | TASK [Install collections from Ansible galaxy] ********************************* 2025-09-23 18:39:45.769669 | orchestrator | changed: [testbed-manager] => (item=ansible.netcommon) 2025-09-23 18:39:45.769767 | orchestrator | changed: [testbed-manager] => (item=ansible.posix) 2025-09-23 18:39:45.769784 | orchestrator | changed: [testbed-manager] => (item=community.docker>=3.10.2) 2025-09-23 18:39:45.769797 | orchestrator | 2025-09-23 18:39:45.769810 | orchestrator | TASK [Install local collections] *********************************************** 2025-09-23 18:39:48.062558 | orchestrator | changed: [testbed-manager] => 
(item=ansible-collection-commons) 2025-09-23 18:39:48.062642 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-services) 2025-09-23 18:39:48.062657 | orchestrator | 2025-09-23 18:39:48.062670 | orchestrator | PLAY [Create operator user] **************************************************** 2025-09-23 18:39:48.062682 | orchestrator | 2025-09-23 18:39:48.062694 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-23 18:39:49.423704 | orchestrator | ok: [testbed-manager] 2025-09-23 18:39:49.423847 | orchestrator | 2025-09-23 18:39:49.423868 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2025-09-23 18:39:49.465681 | orchestrator | ok: [testbed-manager] 2025-09-23 18:39:49.465770 | orchestrator | 2025-09-23 18:39:49.465786 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2025-09-23 18:39:49.525948 | orchestrator | ok: [testbed-manager] 2025-09-23 18:39:49.526088 | orchestrator | 2025-09-23 18:39:49.526110 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2025-09-23 18:39:50.333952 | orchestrator | changed: [testbed-manager] 2025-09-23 18:39:50.333996 | orchestrator | 2025-09-23 18:39:50.334005 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2025-09-23 18:39:51.059571 | orchestrator | changed: [testbed-manager] 2025-09-23 18:39:51.059658 | orchestrator | 2025-09-23 18:39:51.059673 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2025-09-23 18:39:52.425148 | orchestrator | changed: [testbed-manager] => (item=adm) 2025-09-23 18:39:52.425184 | orchestrator | changed: [testbed-manager] => (item=sudo) 2025-09-23 18:39:52.425191 | orchestrator | 2025-09-23 18:39:52.425203 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] 
************************* 2025-09-23 18:39:53.811104 | orchestrator | changed: [testbed-manager] 2025-09-23 18:39:53.811206 | orchestrator | 2025-09-23 18:39:53.811223 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] *** 2025-09-23 18:39:55.572117 | orchestrator | changed: [testbed-manager] => (item=export LANGUAGE=C.UTF-8) 2025-09-23 18:39:55.572194 | orchestrator | changed: [testbed-manager] => (item=export LANG=C.UTF-8) 2025-09-23 18:39:55.572208 | orchestrator | changed: [testbed-manager] => (item=export LC_ALL=C.UTF-8) 2025-09-23 18:39:55.572219 | orchestrator | 2025-09-23 18:39:55.572233 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] *** 2025-09-23 18:39:55.629378 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:39:55.629475 | orchestrator | 2025-09-23 18:39:55.629491 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] ************************** 2025-09-23 18:39:56.191338 | orchestrator | changed: [testbed-manager] 2025-09-23 18:39:56.191436 | orchestrator | 2025-09-23 18:39:56.191455 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2025-09-23 18:39:56.259865 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:39:56.259913 | orchestrator | 2025-09-23 18:39:56.259919 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 2025-09-23 18:39:57.107165 | orchestrator | changed: [testbed-manager] => (item=None) 2025-09-23 18:39:57.107207 | orchestrator | changed: [testbed-manager] 2025-09-23 18:39:57.107216 | orchestrator | 2025-09-23 18:39:57.107223 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2025-09-23 18:39:57.143061 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:39:57.143132 | orchestrator | 2025-09-23 18:39:57.143147 | orchestrator | TASK 
[osism.commons.operator : Set authorized GitHub accounts] ***************** 2025-09-23 18:39:57.173822 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:39:57.173879 | orchestrator | 2025-09-23 18:39:57.173893 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2025-09-23 18:39:57.202969 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:39:57.203019 | orchestrator | 2025-09-23 18:39:57.203032 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2025-09-23 18:39:57.244651 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:39:57.244733 | orchestrator | 2025-09-23 18:39:57.244751 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2025-09-23 18:39:57.963462 | orchestrator | ok: [testbed-manager] 2025-09-23 18:39:57.963549 | orchestrator | 2025-09-23 18:39:57.963565 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2025-09-23 18:39:57.963578 | orchestrator | 2025-09-23 18:39:57.963589 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-23 18:39:59.384377 | orchestrator | ok: [testbed-manager] 2025-09-23 18:39:59.384586 | orchestrator | 2025-09-23 18:39:59.384604 | orchestrator | TASK [Recursively change ownership of /opt/venv] ******************************* 2025-09-23 18:40:00.344614 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:00.344677 | orchestrator | 2025-09-23 18:40:00.344686 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 18:40:00.344695 | orchestrator | testbed-manager : ok=33 changed=23 unreachable=0 failed=0 skipped=13 rescued=0 ignored=0 2025-09-23 18:40:00.344703 | orchestrator | 2025-09-23 18:40:00.918229 | orchestrator | ok: Runtime: 0:07:31.032372 2025-09-23 18:40:00.937470 | 2025-09-23 18:40:00.937618 | TASK [Point 
out that the log in on the manager is now possible] 2025-09-23 18:40:00.970547 | orchestrator | ok: It is now already possible to log in to the manager with 'make login'. 2025-09-23 18:40:00.977544 | 2025-09-23 18:40:00.977641 | TASK [Point out that the following task takes some time and does not give any output] 2025-09-23 18:40:01.007369 | orchestrator | ok: The task 'Run manager part 1 + 2' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 2025-09-23 18:40:01.015922 | 2025-09-23 18:40:01.016071 | TASK [Run manager part 1 + 2] 2025-09-23 18:40:01.876345 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-09-23 18:40:01.928498 | orchestrator | 2025-09-23 18:40:01.928549 | orchestrator | PLAY [Run manager part 1] ****************************************************** 2025-09-23 18:40:01.928557 | orchestrator | 2025-09-23 18:40:01.928569 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-23 18:40:04.807921 | orchestrator | ok: [testbed-manager] 2025-09-23 18:40:04.807973 | orchestrator | 2025-09-23 18:40:04.807996 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2025-09-23 18:40:04.845724 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:40:04.845778 | orchestrator | 2025-09-23 18:40:04.845790 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2025-09-23 18:40:04.892523 | orchestrator | ok: [testbed-manager] 2025-09-23 18:40:04.892576 | orchestrator | 2025-09-23 18:40:04.892586 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-09-23 18:40:04.941238 | orchestrator | ok: [testbed-manager] 2025-09-23 18:40:04.941285 | orchestrator | 2025-09-23 18:40:04.941295 | orchestrator | TASK [osism.commons.repository : Set repository_default fact 
to default value] *** 2025-09-23 18:40:05.004810 | orchestrator | ok: [testbed-manager] 2025-09-23 18:40:05.004866 | orchestrator | 2025-09-23 18:40:05.004877 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-09-23 18:40:05.061271 | orchestrator | ok: [testbed-manager] 2025-09-23 18:40:05.061323 | orchestrator | 2025-09-23 18:40:05.061333 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-09-23 18:40:05.101952 | orchestrator | included: /home/zuul-testbed02/.ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager 2025-09-23 18:40:05.101995 | orchestrator | 2025-09-23 18:40:05.102001 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-09-23 18:40:05.816092 | orchestrator | ok: [testbed-manager] 2025-09-23 18:40:05.816148 | orchestrator | 2025-09-23 18:40:05.816159 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-09-23 18:40:05.860222 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:40:05.860275 | orchestrator | 2025-09-23 18:40:05.860284 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-09-23 18:40:07.147394 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:07.147507 | orchestrator | 2025-09-23 18:40:07.147519 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2025-09-23 18:40:07.678640 | orchestrator | ok: [testbed-manager] 2025-09-23 18:40:07.678683 | orchestrator | 2025-09-23 18:40:07.678692 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-09-23 18:40:08.703896 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:08.703935 | orchestrator | 2025-09-23 18:40:08.703945 | orchestrator | TASK [osism.commons.repository : Update 
package cache] ************************* 2025-09-23 18:40:23.696071 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:23.696137 | orchestrator | 2025-09-23 18:40:23.696153 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2025-09-23 18:40:24.312163 | orchestrator | ok: [testbed-manager] 2025-09-23 18:40:24.312237 | orchestrator | 2025-09-23 18:40:24.312254 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2025-09-23 18:40:24.363154 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:40:24.363222 | orchestrator | 2025-09-23 18:40:24.363237 | orchestrator | TASK [Copy SSH public key] ***************************************************** 2025-09-23 18:40:25.279055 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:25.279128 | orchestrator | 2025-09-23 18:40:25.279145 | orchestrator | TASK [Copy SSH private key] **************************************************** 2025-09-23 18:40:26.201912 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:26.201948 | orchestrator | 2025-09-23 18:40:26.201956 | orchestrator | TASK [Create configuration directory] ****************************************** 2025-09-23 18:40:26.760368 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:26.760481 | orchestrator | 2025-09-23 18:40:26.760498 | orchestrator | TASK [Copy testbed repo] ******************************************************* 2025-09-23 18:40:26.800945 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2025-09-23 18:40:26.801050 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2025-09-23 18:40:26.801065 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2025-09-23 18:40:26.801077 | orchestrator | deprecation_warnings=False in ansible.cfg. 
2025-09-23 18:40:28.685214 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:28.685311 | orchestrator | 2025-09-23 18:40:28.685330 | orchestrator | TASK [Install python requirements in venv] ************************************* 2025-09-23 18:40:37.498826 | orchestrator | ok: [testbed-manager] => (item=Jinja2) 2025-09-23 18:40:37.498858 | orchestrator | ok: [testbed-manager] => (item=PyYAML) 2025-09-23 18:40:37.498864 | orchestrator | ok: [testbed-manager] => (item=packaging) 2025-09-23 18:40:37.498869 | orchestrator | changed: [testbed-manager] => (item=python-gilt==1.2.3) 2025-09-23 18:40:37.498876 | orchestrator | ok: [testbed-manager] => (item=requests>=2.32.2) 2025-09-23 18:40:37.498881 | orchestrator | ok: [testbed-manager] => (item=docker>=7.1.0) 2025-09-23 18:40:37.498885 | orchestrator | 2025-09-23 18:40:37.498890 | orchestrator | TASK [Copy testbed custom CA certificate on Debian/Ubuntu] ********************* 2025-09-23 18:40:38.528324 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:38.528392 | orchestrator | 2025-09-23 18:40:38.528431 | orchestrator | TASK [Copy testbed custom CA certificate on CentOS] **************************** 2025-09-23 18:40:38.571798 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:40:38.571855 | orchestrator | 2025-09-23 18:40:38.571864 | orchestrator | TASK [Run update-ca-certificates on Debian/Ubuntu] ***************************** 2025-09-23 18:40:41.490341 | orchestrator | changed: [testbed-manager] 2025-09-23 18:40:41.491192 | orchestrator | 2025-09-23 18:40:41.491230 | orchestrator | TASK [Run update-ca-trust on RedHat] ******************************************* 2025-09-23 18:40:41.520933 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:40:41.520981 | orchestrator | 2025-09-23 18:40:41.520991 | orchestrator | TASK [Run manager part 2] ****************************************************** 2025-09-23 18:42:14.803832 | orchestrator | changed: [testbed-manager] 2025-09-23 
18:42:14.803930 | orchestrator | 2025-09-23 18:42:14.803948 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-09-23 18:42:15.898422 | orchestrator | ok: [testbed-manager] 2025-09-23 18:42:15.898494 | orchestrator | 2025-09-23 18:42:15.898511 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 18:42:15.898526 | orchestrator | testbed-manager : ok=21 changed=11 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0 2025-09-23 18:42:15.898537 | orchestrator | 2025-09-23 18:42:16.139842 | orchestrator | ok: Runtime: 0:02:14.623854 2025-09-23 18:42:16.155369 | 2025-09-23 18:42:16.155503 | TASK [Reboot manager] 2025-09-23 18:42:17.689698 | orchestrator | ok: Runtime: 0:00:00.955091 2025-09-23 18:42:17.706571 | 2025-09-23 18:42:17.706713 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2025-09-23 18:42:32.202271 | orchestrator | ok 2025-09-23 18:42:32.212475 | 2025-09-23 18:42:32.212610 | TASK [Wait a little longer for the manager so that everything is ready] 2025-09-23 18:43:32.260226 | orchestrator | ok 2025-09-23 18:43:32.271511 | 2025-09-23 18:43:32.271645 | TASK [Deploy manager + bootstrap nodes] 2025-09-23 18:43:34.882068 | orchestrator | 2025-09-23 18:43:34.882281 | orchestrator | # DEPLOY MANAGER 2025-09-23 18:43:34.882319 | orchestrator | 2025-09-23 18:43:34.882336 | orchestrator | + set -e 2025-09-23 18:43:34.882351 | orchestrator | + echo 2025-09-23 18:43:34.882366 | orchestrator | + echo '# DEPLOY MANAGER' 2025-09-23 18:43:34.882416 | orchestrator | + echo 2025-09-23 18:43:34.882472 | orchestrator | + cat /opt/manager-vars.sh 2025-09-23 18:43:34.885340 | orchestrator | export NUMBER_OF_NODES=6 2025-09-23 18:43:34.885414 | orchestrator | 2025-09-23 18:43:34.885421 | orchestrator | export CEPH_VERSION=reef 2025-09-23 18:43:34.885428 | orchestrator | export CONFIGURATION_VERSION=main 2025-09-23 18:43:34.885435 | orchestrator 
| export MANAGER_VERSION=latest
2025-09-23 18:43:34.885449 | orchestrator | export OPENSTACK_VERSION=2024.2
2025-09-23 18:43:34.885454 | orchestrator |
2025-09-23 18:43:34.885462 | orchestrator | export ARA=false
2025-09-23 18:43:34.885466 | orchestrator | export DEPLOY_MODE=manager
2025-09-23 18:43:34.885475 | orchestrator | export TEMPEST=false
2025-09-23 18:43:34.885479 | orchestrator | export IS_ZUUL=true
2025-09-23 18:43:34.885484 | orchestrator |
2025-09-23 18:43:34.885491 | orchestrator | export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.123
2025-09-23 18:43:34.885496 | orchestrator | export EXTERNAL_API=false
2025-09-23 18:43:34.885501 | orchestrator |
2025-09-23 18:43:34.885505 | orchestrator | export IMAGE_USER=ubuntu
2025-09-23 18:43:34.885512 | orchestrator | export IMAGE_NODE_USER=ubuntu
2025-09-23 18:43:34.885516 | orchestrator |
2025-09-23 18:43:34.885520 | orchestrator | export CEPH_STACK=ceph-ansible
2025-09-23 18:43:34.885764 | orchestrator |
2025-09-23 18:43:34.885771 | orchestrator | + echo
2025-09-23 18:43:34.885776 | orchestrator | + source /opt/configuration/scripts/include.sh
2025-09-23 18:43:34.886882 | orchestrator | ++ export INTERACTIVE=false
2025-09-23 18:43:34.886897 | orchestrator | ++ INTERACTIVE=false
2025-09-23 18:43:34.886905 | orchestrator | ++ export OSISM_APPLY_RETRY=1
2025-09-23 18:43:34.886913 | orchestrator | ++ OSISM_APPLY_RETRY=1
2025-09-23 18:43:34.887219 | orchestrator | + source /opt/manager-vars.sh
2025-09-23 18:43:34.887226 | orchestrator | ++ export NUMBER_OF_NODES=6
2025-09-23 18:43:34.887231 | orchestrator | ++ NUMBER_OF_NODES=6
2025-09-23 18:43:34.887237 | orchestrator | ++ export CEPH_VERSION=reef
2025-09-23 18:43:34.887241 | orchestrator | ++ CEPH_VERSION=reef
2025-09-23 18:43:34.887246 | orchestrator | ++ export CONFIGURATION_VERSION=main
2025-09-23 18:43:34.887250 | orchestrator | ++ CONFIGURATION_VERSION=main
2025-09-23 18:43:34.887302 | orchestrator | ++ export MANAGER_VERSION=latest
2025-09-23 18:43:34.887308 | orchestrator | ++ MANAGER_VERSION=latest
2025-09-23 18:43:34.887313 | orchestrator | ++ export OPENSTACK_VERSION=2024.2
2025-09-23 18:43:34.887323 | orchestrator | ++ OPENSTACK_VERSION=2024.2
2025-09-23 18:43:34.887329 | orchestrator | ++ export ARA=false
2025-09-23 18:43:34.887334 | orchestrator | ++ ARA=false
2025-09-23 18:43:34.887536 | orchestrator | ++ export DEPLOY_MODE=manager
2025-09-23 18:43:34.887543 | orchestrator | ++ DEPLOY_MODE=manager
2025-09-23 18:43:34.887548 | orchestrator | ++ export TEMPEST=false
2025-09-23 18:43:34.887552 | orchestrator | ++ TEMPEST=false
2025-09-23 18:43:34.887556 | orchestrator | ++ export IS_ZUUL=true
2025-09-23 18:43:34.887560 | orchestrator | ++ IS_ZUUL=true
2025-09-23 18:43:34.887566 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.123
2025-09-23 18:43:34.887597 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.123
2025-09-23 18:43:34.887785 | orchestrator | ++ export EXTERNAL_API=false
2025-09-23 18:43:34.887791 | orchestrator | ++ EXTERNAL_API=false
2025-09-23 18:43:34.887796 | orchestrator | ++ export IMAGE_USER=ubuntu
2025-09-23 18:43:34.887800 | orchestrator | ++ IMAGE_USER=ubuntu
2025-09-23 18:43:34.887857 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu
2025-09-23 18:43:34.887865 | orchestrator | ++ IMAGE_NODE_USER=ubuntu
2025-09-23 18:43:34.887951 | orchestrator | ++ export CEPH_STACK=ceph-ansible
2025-09-23 18:43:34.887957 | orchestrator | ++ CEPH_STACK=ceph-ansible
2025-09-23 18:43:34.887963 | orchestrator | + sudo ln -sf /opt/configuration/contrib/semver2.sh /usr/local/bin/semver
2025-09-23 18:43:34.953355 | orchestrator | + docker version
2025-09-23 18:43:35.286661 | orchestrator | Client: Docker Engine - Community
2025-09-23 18:43:35.286755 | orchestrator | Version: 27.5.1
2025-09-23 18:43:35.286768 | orchestrator | API version: 1.47
2025-09-23 18:43:35.286776 | orchestrator | Go version: go1.22.11
2025-09-23 18:43:35.286783 | orchestrator | Git commit: 9f9e405
2025-09-23 18:43:35.286791 | orchestrator | Built: Wed Jan 22 13:41:48 2025
2025-09-23 18:43:35.286800 | orchestrator | OS/Arch: linux/amd64
2025-09-23 18:43:35.286808 | orchestrator | Context: default
2025-09-23 18:43:35.286815 | orchestrator |
2025-09-23 18:43:35.286823 | orchestrator | Server: Docker Engine - Community
2025-09-23 18:43:35.286830 | orchestrator | Engine:
2025-09-23 18:43:35.286839 | orchestrator | Version: 27.5.1
2025-09-23 18:43:35.286846 | orchestrator | API version: 1.47 (minimum version 1.24)
2025-09-23 18:43:35.286877 | orchestrator | Go version: go1.22.11
2025-09-23 18:43:35.286885 | orchestrator | Git commit: 4c9b3b0
2025-09-23 18:43:35.286893 | orchestrator | Built: Wed Jan 22 13:41:48 2025
2025-09-23 18:43:35.286900 | orchestrator | OS/Arch: linux/amd64
2025-09-23 18:43:35.286907 | orchestrator | Experimental: false
2025-09-23 18:43:35.286915 | orchestrator | containerd:
2025-09-23 18:43:35.286923 | orchestrator | Version: 1.7.27
2025-09-23 18:43:35.286930 | orchestrator | GitCommit: 05044ec0a9a75232cad458027ca83437aae3f4da
2025-09-23 18:43:35.286938 | orchestrator | runc:
2025-09-23 18:43:35.286945 | orchestrator | Version: 1.2.5
2025-09-23 18:43:35.286953 | orchestrator | GitCommit: v1.2.5-0-g59923ef
2025-09-23 18:43:35.286960 | orchestrator | docker-init:
2025-09-23 18:43:35.286967 | orchestrator | Version: 0.19.0
2025-09-23 18:43:35.286975 | orchestrator | GitCommit: de40ad0
2025-09-23 18:43:35.290346 | orchestrator | + sh -c /opt/configuration/scripts/deploy/000-manager.sh
2025-09-23 18:43:35.298742 | orchestrator | + set -e
2025-09-23 18:43:35.298795 | orchestrator | + source /opt/manager-vars.sh
2025-09-23 18:43:35.298804 | orchestrator | ++ export NUMBER_OF_NODES=6
2025-09-23 18:43:35.298813 | orchestrator | ++ NUMBER_OF_NODES=6
2025-09-23 18:43:35.298822 | orchestrator | ++ export CEPH_VERSION=reef
2025-09-23 18:43:35.298830 | orchestrator | ++ CEPH_VERSION=reef
2025-09-23 18:43:35.298839 | orchestrator | ++ export CONFIGURATION_VERSION=main
2025-09-23 18:43:35.298848 | orchestrator | ++ CONFIGURATION_VERSION=main
2025-09-23 18:43:35.298857 | orchestrator | ++ export MANAGER_VERSION=latest
2025-09-23 18:43:35.298867 | orchestrator | ++ MANAGER_VERSION=latest
2025-09-23 18:43:35.298875 | orchestrator | ++ export OPENSTACK_VERSION=2024.2
2025-09-23 18:43:35.298884 | orchestrator | ++ OPENSTACK_VERSION=2024.2
2025-09-23 18:43:35.298893 | orchestrator | ++ export ARA=false
2025-09-23 18:43:35.298901 | orchestrator | ++ ARA=false
2025-09-23 18:43:35.298910 | orchestrator | ++ export DEPLOY_MODE=manager
2025-09-23 18:43:35.298919 | orchestrator | ++ DEPLOY_MODE=manager
2025-09-23 18:43:35.298928 | orchestrator | ++ export TEMPEST=false
2025-09-23 18:43:35.298937 | orchestrator | ++ TEMPEST=false
2025-09-23 18:43:35.298945 | orchestrator | ++ export IS_ZUUL=true
2025-09-23 18:43:35.298954 | orchestrator | ++ IS_ZUUL=true
2025-09-23 18:43:35.298962 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.123
2025-09-23 18:43:35.298971 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.123
2025-09-23 18:43:35.298980 | orchestrator | ++ export EXTERNAL_API=false
2025-09-23 18:43:35.298988 | orchestrator | ++ EXTERNAL_API=false
2025-09-23 18:43:35.298997 | orchestrator | ++ export IMAGE_USER=ubuntu
2025-09-23 18:43:35.299005 | orchestrator | ++ IMAGE_USER=ubuntu
2025-09-23 18:43:35.299014 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu
2025-09-23 18:43:35.299022 | orchestrator | ++ IMAGE_NODE_USER=ubuntu
2025-09-23 18:43:35.299031 | orchestrator | ++ export CEPH_STACK=ceph-ansible
2025-09-23 18:43:35.299040 | orchestrator | ++ CEPH_STACK=ceph-ansible
2025-09-23 18:43:35.299050 | orchestrator | + source /opt/configuration/scripts/include.sh
2025-09-23 18:43:35.299060 | orchestrator | ++ export INTERACTIVE=false
2025-09-23 18:43:35.299071 | orchestrator | ++ INTERACTIVE=false
2025-09-23 18:43:35.299082 | orchestrator | ++ export OSISM_APPLY_RETRY=1
2025-09-23 18:43:35.299099 | orchestrator | ++ OSISM_APPLY_RETRY=1
2025-09-23 18:43:35.299119 | orchestrator | + [[ latest != \l\a\t\e\s\t ]]
2025-09-23 18:43:35.299130 | orchestrator | + [[ latest == \l\a\t\e\s\t ]]
2025-09-23 18:43:35.299141 | orchestrator | + /opt/configuration/scripts/set-ceph-version.sh reef
2025-09-23 18:43:35.307294 | orchestrator | + set -e
2025-09-23 18:43:35.307335 | orchestrator | + VERSION=reef
2025-09-23 18:43:35.307704 | orchestrator | ++ grep '^ceph_version:' /opt/configuration/environments/manager/configuration.yml
2025-09-23 18:43:35.314520 | orchestrator | + [[ -n ceph_version: reef ]]
2025-09-23 18:43:35.314562 | orchestrator | + sed -i 's/ceph_version: .*/ceph_version: reef/g' /opt/configuration/environments/manager/configuration.yml
2025-09-23 18:43:35.319575 | orchestrator | + /opt/configuration/scripts/set-openstack-version.sh 2024.2
2025-09-23 18:43:35.326915 | orchestrator | + set -e
2025-09-23 18:43:35.327496 | orchestrator | + VERSION=2024.2
2025-09-23 18:43:35.327968 | orchestrator | ++ grep '^openstack_version:' /opt/configuration/environments/manager/configuration.yml
2025-09-23 18:43:35.332256 | orchestrator | + [[ -n openstack_version: 2024.2 ]]
2025-09-23 18:43:35.332302 | orchestrator | + sed -i 's/openstack_version: .*/openstack_version: 2024.2/g' /opt/configuration/environments/manager/configuration.yml
2025-09-23 18:43:35.336822 | orchestrator | + [[ ceph-ansible == \r\o\o\k ]]
2025-09-23 18:43:35.337869 | orchestrator | ++ semver latest 7.0.0
2025-09-23 18:43:35.401041 | orchestrator | + [[ -1 -ge 0 ]]
2025-09-23 18:43:35.401129 | orchestrator | + [[ latest == \l\a\t\e\s\t ]]
2025-09-23 18:43:35.401143 | orchestrator | + echo 'enable_osism_kubernetes: true'
2025-09-23 18:43:35.401156 | orchestrator | + /opt/configuration/scripts/enable-resource-nodes.sh
2025-09-23 18:43:35.503045 | orchestrator | + [[ -e /opt/venv/bin/activate ]]
2025-09-23 18:43:35.509206 | orchestrator | + source /opt/venv/bin/activate
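The set-ceph-version.sh and set-openstack-version.sh traces above share one pattern: guard with grep, then pin the value with sed. A minimal sketch of that pattern under stated assumptions — the mktemp file stands in for /opt/configuration/environments/manager/configuration.yml, and the hard-coded VERSION stands in for the script's argument:

```shell
set -e

# Stand-in for /opt/configuration/environments/manager/configuration.yml
CONFIG=$(mktemp)
printf 'ceph_version: quincy\n' > "$CONFIG"

VERSION=reef  # the real script takes this as $1

# Only rewrite the line when the key is already present,
# mirroring the `[[ -n $(grep ...) ]]` guard in the trace
if [ -n "$(grep '^ceph_version:' "$CONFIG")" ]; then
    sed -i "s/ceph_version: .*/ceph_version: ${VERSION}/g" "$CONFIG"
fi

RESULT=$(cat "$CONFIG")
echo "$RESULT"
```

The grep guard keeps the sed a no-op when the key is absent, so the script never appends a duplicate line.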
2025-09-23 18:43:35.510084 | orchestrator | ++ deactivate nondestructive
2025-09-23 18:43:35.510148 | orchestrator | ++ '[' -n '' ']'
2025-09-23 18:43:35.510169 | orchestrator | ++ '[' -n '' ']'
2025-09-23 18:43:35.510199 | orchestrator | ++ hash -r
2025-09-23 18:43:35.510427 | orchestrator | ++ '[' -n '' ']'
2025-09-23 18:43:35.510464 | orchestrator | ++ unset VIRTUAL_ENV
2025-09-23 18:43:35.510481 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT
2025-09-23 18:43:35.510508 | orchestrator | ++ '[' '!' nondestructive = nondestructive ']'
2025-09-23 18:43:35.510527 | orchestrator | ++ '[' linux-gnu = cygwin ']'
2025-09-23 18:43:35.510563 | orchestrator | ++ '[' linux-gnu = msys ']'
2025-09-23 18:43:35.510583 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv
2025-09-23 18:43:35.510600 | orchestrator | ++ VIRTUAL_ENV=/opt/venv
2025-09-23 18:43:35.510618 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
2025-09-23 18:43:35.510636 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
2025-09-23 18:43:35.510653 | orchestrator | ++ export PATH
2025-09-23 18:43:35.510670 | orchestrator | ++ '[' -n '' ']'
2025-09-23 18:43:35.510726 | orchestrator | ++ '[' -z '' ']'
2025-09-23 18:43:35.510754 | orchestrator | ++ _OLD_VIRTUAL_PS1=
2025-09-23 18:43:35.510777 | orchestrator | ++ PS1='(venv) '
2025-09-23 18:43:35.510795 | orchestrator | ++ export PS1
2025-09-23 18:43:35.510812 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) '
2025-09-23 18:43:35.510830 | orchestrator | ++ export VIRTUAL_ENV_PROMPT
2025-09-23 18:43:35.510847 | orchestrator | ++ hash -r
2025-09-23 18:43:35.511111 | orchestrator | + ansible-playbook -i testbed-manager, --vault-password-file /opt/configuration/environments/.vault_pass /opt/configuration/ansible/manager-part-3.yml
2025-09-23 18:43:36.841972 | orchestrator |
2025-09-23 18:43:36.842160 | orchestrator |
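The bin/activate trace above reduces to a handful of environment changes. A sketch of the essential effect, with the paths taken directly from the log:

```shell
# What `source /opt/venv/bin/activate` boils down to, per the trace:
# remember the old PATH, prepend the venv's bin directory, mark the prompt.
VIRTUAL_ENV=/opt/venv
_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/bin:$PATH"
export VIRTUAL_ENV PATH
_OLD_VIRTUAL_PS1="${PS1:-}"
PS1="(venv) ${PS1:-}"
VIRTUAL_ENV_PROMPT='(venv) '
export VIRTUAL_ENV_PROMPT
```

The ansible-playbook call that follows uses `-i testbed-manager,` — the trailing comma makes Ansible treat the argument as an inline one-host list rather than a path to an inventory file.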
PLAY [Copy custom facts] *******************************************************
2025-09-23 18:43:36.842178 | orchestrator |
2025-09-23 18:43:36.842190 | orchestrator | TASK [Create custom facts directory] *******************************************
2025-09-23 18:43:37.419917 | orchestrator | ok: [testbed-manager]
2025-09-23 18:43:37.420027 | orchestrator |
2025-09-23 18:43:37.420044 | orchestrator | TASK [Copy fact files] *********************************************************
2025-09-23 18:43:38.435847 | orchestrator | changed: [testbed-manager]
2025-09-23 18:43:38.435963 | orchestrator |
2025-09-23 18:43:38.435981 | orchestrator | PLAY [Before the deployment of the manager] ************************************
2025-09-23 18:43:38.435994 | orchestrator |
2025-09-23 18:43:38.436006 | orchestrator | TASK [Gathering Facts] *********************************************************
2025-09-23 18:43:40.796946 | orchestrator | ok: [testbed-manager]
2025-09-23 18:43:40.797062 | orchestrator |
2025-09-23 18:43:40.797079 | orchestrator | TASK [Get /opt/manager-vars.sh] ************************************************
2025-09-23 18:43:40.843571 | orchestrator | ok: [testbed-manager]
2025-09-23 18:43:40.843642 | orchestrator |
2025-09-23 18:43:40.843662 | orchestrator | TASK [Add ara_server_mariadb_volume_type parameter] ****************************
2025-09-23 18:43:41.304939 | orchestrator | changed: [testbed-manager]
2025-09-23 18:43:41.305047 | orchestrator |
2025-09-23 18:43:41.305063 | orchestrator | TASK [Add netbox_enable parameter] *********************************************
2025-09-23 18:43:41.350788 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:43:41.350889 | orchestrator |
2025-09-23 18:43:41.350904 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************
2025-09-23 18:43:41.682878 | orchestrator | changed: [testbed-manager]
2025-09-23 18:43:41.682976 | orchestrator |
2025-09-23 18:43:41.682991 | orchestrator | TASK [Use insecure glance configuration] ***************************************
2025-09-23 18:43:41.735967 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:43:41.736035 | orchestrator |
2025-09-23 18:43:41.736048 | orchestrator | TASK [Check if /etc/OTC_region exist] ******************************************
2025-09-23 18:43:42.079589 | orchestrator | ok: [testbed-manager]
2025-09-23 18:43:42.079696 | orchestrator |
2025-09-23 18:43:42.079715 | orchestrator | TASK [Add nova_compute_virt_type parameter] ************************************
2025-09-23 18:43:42.203891 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:43:42.203983 | orchestrator |
2025-09-23 18:43:42.203998 | orchestrator | PLAY [Apply role traefik] ******************************************************
2025-09-23 18:43:42.204010 | orchestrator |
2025-09-23 18:43:42.204023 | orchestrator | TASK [Gathering Facts] *********************************************************
2025-09-23 18:43:44.985687 | orchestrator | ok: [testbed-manager]
2025-09-23 18:43:44.985796 | orchestrator |
2025-09-23 18:43:44.985813 | orchestrator | TASK [Apply traefik role] ******************************************************
2025-09-23 18:43:45.113095 | orchestrator | included: osism.services.traefik for testbed-manager
2025-09-23 18:43:45.113183 | orchestrator |
2025-09-23 18:43:45.113197 | orchestrator | TASK [osism.services.traefik : Include config tasks] ***************************
2025-09-23 18:43:45.184893 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/config.yml for testbed-manager
2025-09-23 18:43:45.184974 | orchestrator |
2025-09-23 18:43:45.184988 | orchestrator | TASK [osism.services.traefik : Create required directories] ********************
2025-09-23 18:43:46.299624 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik)
2025-09-23 18:43:46.299746 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/certificates)
2025-09-23 18:43:46.299762 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/configuration)
2025-09-23 18:43:46.299774 | orchestrator |
2025-09-23 18:43:46.299800 | orchestrator | TASK [osism.services.traefik : Copy configuration files] ***********************
2025-09-23 18:43:48.092991 | orchestrator | changed: [testbed-manager] => (item=traefik.yml)
2025-09-23 18:43:48.093091 | orchestrator | changed: [testbed-manager] => (item=traefik.env)
2025-09-23 18:43:48.093110 | orchestrator | changed: [testbed-manager] => (item=certificates.yml)
2025-09-23 18:43:48.093123 | orchestrator |
2025-09-23 18:43:48.093135 | orchestrator | TASK [osism.services.traefik : Copy certificate cert files] ********************
2025-09-23 18:43:48.785630 | orchestrator | changed: [testbed-manager] => (item=None)
2025-09-23 18:43:48.785721 | orchestrator | changed: [testbed-manager]
2025-09-23 18:43:48.785736 | orchestrator |
2025-09-23 18:43:48.785750 | orchestrator | TASK [osism.services.traefik : Copy certificate key files] *********************
2025-09-23 18:43:49.412018 | orchestrator | changed: [testbed-manager] => (item=None)
2025-09-23 18:43:49.412112 | orchestrator | changed: [testbed-manager]
2025-09-23 18:43:49.412127 | orchestrator |
2025-09-23 18:43:49.412140 | orchestrator | TASK [osism.services.traefik : Copy dynamic configuration] *********************
2025-09-23 18:43:49.465093 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:43:49.465169 | orchestrator |
2025-09-23 18:43:49.465184 | orchestrator | TASK [osism.services.traefik : Remove dynamic configuration] *******************
2025-09-23 18:43:49.831981 | orchestrator | ok: [testbed-manager]
2025-09-23 18:43:49.832084 | orchestrator |
2025-09-23 18:43:49.832110 | orchestrator | TASK [osism.services.traefik : Include service tasks] **************************
2025-09-23 18:43:49.916875 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/service.yml for testbed-manager
2025-09-23 18:43:49.916976 | orchestrator |
2025-09-23 18:43:49.917000 | orchestrator | TASK [osism.services.traefik : Create traefik external network] ****************
2025-09-23 18:43:50.929282 | orchestrator | changed: [testbed-manager]
2025-09-23 18:43:50.929441 | orchestrator |
2025-09-23 18:43:50.929462 | orchestrator | TASK [osism.services.traefik : Copy docker-compose.yml file] *******************
2025-09-23 18:43:51.736630 | orchestrator | changed: [testbed-manager]
2025-09-23 18:43:51.736730 | orchestrator |
2025-09-23 18:43:51.736747 | orchestrator | TASK [osism.services.traefik : Manage traefik service] *************************
2025-09-23 18:44:03.877299 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:03.877436 | orchestrator |
2025-09-23 18:44:03.877456 | orchestrator | RUNNING HANDLER [osism.services.traefik : Restart traefik service] *************
2025-09-23 18:44:03.926467 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:44:03.926539 | orchestrator |
2025-09-23 18:44:03.926554 | orchestrator | PLAY [Deploy manager service] **************************************************
2025-09-23 18:44:03.926566 | orchestrator |
2025-09-23 18:44:03.926578 | orchestrator | TASK [Gathering Facts] *********************************************************
2025-09-23 18:44:05.642273 | orchestrator | ok: [testbed-manager]
2025-09-23 18:44:05.642363 | orchestrator |
2025-09-23 18:44:05.642444 | orchestrator | TASK [Apply manager role] ******************************************************
2025-09-23 18:44:05.745872 | orchestrator | included: osism.services.manager for testbed-manager
2025-09-23 18:44:05.745941 | orchestrator |
2025-09-23 18:44:05.745953 | orchestrator | TASK [osism.services.manager : Include install tasks] **************************
2025-09-23 18:44:05.793021 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/install-Debian-family.yml for testbed-manager
2025-09-23 18:44:05.793089 | orchestrator |
2025-09-23 18:44:05.793103 | orchestrator | TASK [osism.services.manager : Install required packages] **********************
2025-09-23 18:44:08.060819 | orchestrator | ok: [testbed-manager]
2025-09-23 18:44:08.060956 | orchestrator |
2025-09-23 18:44:08.060974 | orchestrator | TASK [osism.services.manager : Gather variables for each operating system] *****
2025-09-23 18:44:08.100933 | orchestrator | ok: [testbed-manager]
2025-09-23 18:44:08.101006 | orchestrator |
2025-09-23 18:44:08.101022 | orchestrator | TASK [osism.services.manager : Include config tasks] ***************************
2025-09-23 18:44:08.203232 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config.yml for testbed-manager
2025-09-23 18:44:08.203303 | orchestrator |
2025-09-23 18:44:08.203317 | orchestrator | TASK [osism.services.manager : Create required directories] ********************
2025-09-23 18:44:10.867976 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible)
2025-09-23 18:44:10.868065 | orchestrator | changed: [testbed-manager] => (item=/opt/archive)
2025-09-23 18:44:10.868081 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/configuration)
2025-09-23 18:44:10.868093 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/data)
2025-09-23 18:44:10.868104 | orchestrator | ok: [testbed-manager] => (item=/opt/manager)
2025-09-23 18:44:10.868115 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/secrets)
2025-09-23 18:44:10.868126 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible/secrets)
2025-09-23 18:44:10.868137 | orchestrator | changed: [testbed-manager] => (item=/opt/state)
2025-09-23 18:44:10.868149 | orchestrator |
2025-09-23 18:44:10.868161 | orchestrator | TASK [osism.services.manager : Copy all environment file] **********************
2025-09-23 18:44:11.480643 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:11.480707 | orchestrator |
2025-09-23 18:44:11.480716 | orchestrator | TASK [osism.services.manager : Copy client environment file] *******************
2025-09-23 18:44:12.055458 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:12.055547 | orchestrator |
2025-09-23 18:44:12.055564 | orchestrator | TASK [osism.services.manager : Include ara config tasks] ***********************
2025-09-23 18:44:12.131755 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ara.yml for testbed-manager
2025-09-23 18:44:12.131822 | orchestrator |
2025-09-23 18:44:12.131835 | orchestrator | TASK [osism.services.manager : Copy ARA environment files] *********************
2025-09-23 18:44:13.389690 | orchestrator | changed: [testbed-manager] => (item=ara)
2025-09-23 18:44:13.389811 | orchestrator | changed: [testbed-manager] => (item=ara-server)
2025-09-23 18:44:13.389827 | orchestrator |
2025-09-23 18:44:13.389840 | orchestrator | TASK [osism.services.manager : Copy MariaDB environment file] ******************
2025-09-23 18:44:14.155843 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:14.155944 | orchestrator |
2025-09-23 18:44:14.155961 | orchestrator | TASK [osism.services.manager : Include vault config tasks] *********************
2025-09-23 18:44:14.204008 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:44:14.204101 | orchestrator |
2025-09-23 18:44:14.204115 | orchestrator | TASK [osism.services.manager : Include frontend config tasks] ******************
2025-09-23 18:44:14.289763 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-frontend.yml for testbed-manager
2025-09-23 18:44:14.289851 | orchestrator |
2025-09-23 18:44:14.289865 | orchestrator | TASK [osism.services.manager : Copy frontend environment file] *****************
2025-09-23 18:44:14.886090 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:14.886177 | orchestrator |
2025-09-23 18:44:14.886192 | orchestrator | TASK [osism.services.manager : Include ansible config tasks] *******************
2025-09-23 18:44:14.950159 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ansible.yml for testbed-manager
2025-09-23 18:44:14.950255 | orchestrator |
2025-09-23 18:44:14.950269 | orchestrator | TASK [osism.services.manager : Copy private ssh keys] **************************
2025-09-23 18:44:16.215105 | orchestrator | changed: [testbed-manager] => (item=None)
2025-09-23 18:44:16.215200 | orchestrator | changed: [testbed-manager] => (item=None)
2025-09-23 18:44:16.215216 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:16.215230 | orchestrator |
2025-09-23 18:44:16.215242 | orchestrator | TASK [osism.services.manager : Copy ansible environment file] ******************
2025-09-23 18:44:16.806875 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:16.806960 | orchestrator |
2025-09-23 18:44:16.806977 | orchestrator | TASK [osism.services.manager : Include netbox config tasks] ********************
2025-09-23 18:44:16.865454 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:44:16.865525 | orchestrator |
2025-09-23 18:44:16.865539 | orchestrator | TASK [osism.services.manager : Include celery config tasks] ********************
2025-09-23 18:44:16.935298 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-celery.yml for testbed-manager
2025-09-23 18:44:16.935402 | orchestrator |
2025-09-23 18:44:16.935419 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_watches] ****************
2025-09-23 18:44:17.391672 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:17.391758 | orchestrator |
2025-09-23 18:44:17.391773 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_instances] **************
2025-09-23 18:44:17.755501 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:17.755584 | orchestrator |
2025-09-23 18:44:17.755600 | orchestrator | TASK [osism.services.manager : Copy celery environment files] ******************
2025-09-23 18:44:18.885555 | orchestrator | changed: [testbed-manager] => (item=conductor)
2025-09-23 18:44:18.885666 | orchestrator | changed: [testbed-manager] => (item=openstack)
2025-09-23 18:44:18.885690 | orchestrator |
2025-09-23 18:44:18.885711 | orchestrator | TASK [osism.services.manager : Copy listener environment file] *****************
2025-09-23 18:44:19.472091 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:19.472205 | orchestrator |
2025-09-23 18:44:19.472222 | orchestrator | TASK [osism.services.manager : Check for conductor.yml] ************************
2025-09-23 18:44:19.801353 | orchestrator | ok: [testbed-manager]
2025-09-23 18:44:19.801479 | orchestrator |
2025-09-23 18:44:19.801496 | orchestrator | TASK [osism.services.manager : Copy conductor configuration file] **************
2025-09-23 18:44:20.134799 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:20.134884 | orchestrator |
2025-09-23 18:44:20.134900 | orchestrator | TASK [osism.services.manager : Copy empty conductor configuration file] ********
2025-09-23 18:44:20.182414 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:44:20.182482 | orchestrator |
2025-09-23 18:44:20.182496 | orchestrator | TASK [osism.services.manager : Include wrapper config tasks] *******************
2025-09-23 18:44:20.254625 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-wrapper.yml for testbed-manager
2025-09-23 18:44:20.254694 | orchestrator |
2025-09-23 18:44:20.254707 | orchestrator | TASK [osism.services.manager : Include wrapper vars file] **********************
2025-09-23 18:44:20.293567 | orchestrator | ok: [testbed-manager]
2025-09-23 18:44:20.293622 | orchestrator |
2025-09-23 18:44:20.293636 | orchestrator | TASK [osism.services.manager : Copy wrapper scripts] ***************************
2025-09-23 18:44:22.190830 | orchestrator | changed: [testbed-manager] => (item=osism)
2025-09-23 18:44:22.190933 | orchestrator | changed: [testbed-manager] => (item=osism-update-docker)
2025-09-23 18:44:22.190949 | orchestrator | changed: [testbed-manager] => (item=osism-update-manager)
2025-09-23 18:44:22.190961 | orchestrator |
2025-09-23 18:44:22.190974 | orchestrator | TASK [osism.services.manager : Copy cilium wrapper script] *********************
2025-09-23 18:44:22.919623 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:22.919724 | orchestrator |
2025-09-23 18:44:22.919743 | orchestrator | TASK [osism.services.manager : Copy hubble wrapper script] *********************
2025-09-23 18:44:23.620077 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:23.620160 | orchestrator |
2025-09-23 18:44:23.620180 | orchestrator | TASK [osism.services.manager : Copy flux wrapper script] ***********************
2025-09-23 18:44:24.329801 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:24.329918 | orchestrator |
2025-09-23 18:44:24.329934 | orchestrator | TASK [osism.services.manager : Include scripts config tasks] *******************
2025-09-23 18:44:24.400597 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-scripts.yml for testbed-manager
2025-09-23 18:44:24.400693 | orchestrator |
2025-09-23 18:44:24.400710 | orchestrator | TASK [osism.services.manager : Include scripts vars file] **********************
2025-09-23 18:44:24.443072 | orchestrator | ok: [testbed-manager]
2025-09-23 18:44:24.443158 | orchestrator |
2025-09-23 18:44:24.443173 | orchestrator | TASK [osism.services.manager : Copy scripts] ***********************************
2025-09-23 18:44:25.150602 | orchestrator | changed: [testbed-manager] => (item=osism-include)
2025-09-23 18:44:25.150705 | orchestrator |
2025-09-23 18:44:25.150721 | orchestrator | TASK [osism.services.manager : Include service tasks] **************************
2025-09-23 18:44:25.241544 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/service.yml for testbed-manager
2025-09-23 18:44:25.241630 | orchestrator |
2025-09-23 18:44:25.241643 | orchestrator | TASK [osism.services.manager : Copy manager systemd unit file] *****************
2025-09-23 18:44:25.961201 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:25.961299 | orchestrator |
2025-09-23 18:44:25.961316 | orchestrator | TASK [osism.services.manager : Create traefik external network] ****************
2025-09-23 18:44:26.570667 | orchestrator | ok: [testbed-manager]
2025-09-23 18:44:26.570758 | orchestrator |
2025-09-23 18:44:26.570774 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb < 11.0.0] ***
2025-09-23 18:44:26.622176 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:44:26.622219 | orchestrator |
2025-09-23 18:44:26.622232 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb >= 11.0.0] ***
2025-09-23 18:44:26.682471 | orchestrator | ok: [testbed-manager]
2025-09-23 18:44:26.682504 | orchestrator |
2025-09-23 18:44:26.682516 | orchestrator | TASK [osism.services.manager : Copy docker-compose.yml file] *******************
2025-09-23 18:44:27.538595 | orchestrator | changed: [testbed-manager]
2025-09-23 18:44:27.538688 | orchestrator |
2025-09-23 18:44:27.538704 | orchestrator | TASK [osism.services.manager : Pull container images] **************************
2025-09-23 18:45:31.576158 | orchestrator | changed: [testbed-manager]
2025-09-23 18:45:31.576255 | orchestrator |
2025-09-23 18:45:31.576273 | orchestrator | TASK [osism.services.manager : Stop and disable old service docker-compose@manager] ***
2025-09-23 18:45:32.460149 | orchestrator | ok: [testbed-manager]
2025-09-23 18:45:32.460239 | orchestrator |
2025-09-23 18:45:32.460255 | orchestrator | TASK [osism.services.manager : Do a manual start of the manager service] *******
2025-09-23 18:45:32.569545 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:45:32.569630 | orchestrator |
2025-09-23 18:45:32.569647 | orchestrator | TASK [osism.services.manager : Manage manager service] *************************
2025-09-23 18:45:35.286490 | orchestrator | changed: [testbed-manager]
2025-09-23 18:45:35.286572 | orchestrator |
2025-09-23 18:45:35.286588 | orchestrator | TASK [osism.services.manager : Register that manager service was started] ******
2025-09-23 18:45:35.340264 | orchestrator | ok: [testbed-manager]
2025-09-23 18:45:35.340342 | orchestrator |
2025-09-23 18:45:35.340386 | orchestrator | TASK [osism.services.manager : Flush handlers] *********************************
2025-09-23 18:45:35.340399 | orchestrator |
2025-09-23 18:45:35.340410 | orchestrator | RUNNING HANDLER [osism.services.manager : Restart manager service] *************
2025-09-23 18:45:35.391033 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:45:35.391125 | orchestrator |
2025-09-23 18:45:35.391142 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for manager service to start] ***
2025-09-23 18:46:35.440589 | orchestrator | Pausing for 60 seconds
2025-09-23 18:46:35.440720 | orchestrator | changed: [testbed-manager]
2025-09-23 18:46:35.440742 | orchestrator |
2025-09-23 18:46:35.440756 | orchestrator | RUNNING HANDLER [osism.services.manager : Ensure that all containers are up] ***
2025-09-23 18:46:39.601029 | orchestrator | changed: [testbed-manager]
2025-09-23 18:46:39.601123 | orchestrator |
2025-09-23 18:46:39.601142 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for an healthy manager service] ***
2025-09-23 18:47:21.163960 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (50 retries left).
2025-09-23 18:47:21.164072 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (49 retries left).
2025-09-23 18:47:21.164090 | orchestrator | changed: [testbed-manager]
2025-09-23 18:47:21.164136 | orchestrator |
2025-09-23 18:47:21.164150 | orchestrator | RUNNING HANDLER [osism.services.manager : Copy osismclient bash completion script] ***
2025-09-23 18:47:31.199574 | orchestrator | changed: [testbed-manager]
2025-09-23 18:47:31.199680 | orchestrator |
2025-09-23 18:47:31.199693 | orchestrator | TASK [osism.services.manager : Include initialize tasks] ***********************
2025-09-23 18:47:31.285834 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/initialize.yml for testbed-manager
2025-09-23 18:47:31.285903 | orchestrator |
2025-09-23 18:47:31.285914 | orchestrator | TASK [osism.services.manager : Flush handlers] *********************************
2025-09-23 18:47:31.285924 | orchestrator |
2025-09-23 18:47:31.285932 | orchestrator | TASK [osism.services.manager : Include vault initialize tasks] *****************
2025-09-23 18:47:31.330793 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:47:31.330846 | orchestrator |
2025-09-23 18:47:31.330856 | orchestrator | TASK [osism.services.manager : Include version verification tasks] *************
2025-09-23 18:47:31.397495 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/verify-versions.yml for testbed-manager
2025-09-23 18:47:31.397551 | orchestrator |
2025-09-23 18:47:31.397561 | orchestrator | TASK [osism.services.manager : Deploy service manager version check script] ****
2025-09-23 18:47:32.175411 | orchestrator | changed: [testbed-manager]
2025-09-23 18:47:32.175522 |
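The "Wait for an healthy manager service" handler above failed twice and then succeeded, with up to 50 retries allowed. The underlying pattern is a bounded health-check poll; a hedged sketch with a simulated check — `check_health` is a stand-in, since the real handler inspects the manager container's health status (e.g. via `docker inspect`) rather than counting attempts:

```shell
# Bounded retry loop: poll a health check until it passes or retries run out.
retries=50
attempt=0
check_health() {
    # Simulated check that becomes healthy on the third attempt;
    # the real handler would query the container's health status instead.
    [ "$attempt" -ge 3 ]
}
until check_health; do
    attempt=$((attempt + 1))
    if [ "$attempt" -ge "$retries" ]; then
        echo "service never became healthy" >&2
        exit 1
    fi
    sleep 0  # the real handler sleeps between retries
done
echo "healthy after $attempt checks"
```

Capping the retries keeps a permanently unhealthy service from hanging the job forever; the handler fails the play once the budget is exhausted.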
orchestrator |
2025-09-23 18:47:32.175540 | orchestrator | TASK [osism.services.manager : Execute service manager version check] **********
2025-09-23 18:47:36.211349 | orchestrator | ok: [testbed-manager]
2025-09-23 18:47:36.211444 | orchestrator |
2025-09-23 18:47:36.211458 | orchestrator | TASK [osism.services.manager : Display version check results] ******************
2025-09-23 18:47:36.285625 | orchestrator | ok: [testbed-manager] => {
2025-09-23 18:47:36.285718 | orchestrator | "version_check_result.stdout_lines": [
2025-09-23 18:47:36.285728 | orchestrator | "=== OSISM Container Version Check ===",
2025-09-23 18:47:36.285736 | orchestrator | "Checking running containers against expected versions...",
2025-09-23 18:47:36.285743 | orchestrator | "",
2025-09-23 18:47:36.285750 | orchestrator | "Checking service: inventory_reconciler (Inventory Reconciler Service)",
2025-09-23 18:47:36.285757 | orchestrator | " Expected: registry.osism.tech/osism/inventory-reconciler:latest",
2025-09-23 18:47:36.285764 | orchestrator | " Enabled: true",
2025-09-23 18:47:36.285771 | orchestrator | " Running: registry.osism.tech/osism/inventory-reconciler:latest",
2025-09-23 18:47:36.285786 | orchestrator | " Status: ✅ MATCH",
2025-09-23 18:47:36.286554 | orchestrator | "",
2025-09-23 18:47:36.286583 | orchestrator | "Checking service: osism-ansible (OSISM Ansible Service)",
2025-09-23 18:47:36.286593 | orchestrator | " Expected: registry.osism.tech/osism/osism-ansible:latest",
2025-09-23 18:47:36.286602 | orchestrator | " Enabled: true",
2025-09-23 18:47:36.286611 | orchestrator | " Running: registry.osism.tech/osism/osism-ansible:latest",
2025-09-23 18:47:36.286620 | orchestrator | " Status: ✅ MATCH",
2025-09-23 18:47:36.286629 | orchestrator | "",
2025-09-23 18:47:36.286639 | orchestrator | "Checking service: osism-kubernetes (Osism-Kubernetes Service)",
2025-09-23 18:47:36.286649 | orchestrator | " Expected: registry.osism.tech/osism/osism-kubernetes:latest",
2025-09-23 18:47:36.286659 | orchestrator | " Enabled: true",
2025-09-23 18:47:36.286668 | orchestrator | " Running: registry.osism.tech/osism/osism-kubernetes:latest",
2025-09-23 18:47:36.286679 | orchestrator | " Status: ✅ MATCH",
2025-09-23 18:47:36.286688 | orchestrator | "",
2025-09-23 18:47:36.286697 | orchestrator | "Checking service: ceph-ansible (Ceph-Ansible Service)",
2025-09-23 18:47:36.286706 | orchestrator | " Expected: registry.osism.tech/osism/ceph-ansible:reef",
2025-09-23 18:47:36.286717 | orchestrator | " Enabled: true",
2025-09-23 18:47:36.286727 | orchestrator | " Running: registry.osism.tech/osism/ceph-ansible:reef",
2025-09-23 18:47:36.286737 | orchestrator | " Status: ✅ MATCH",
2025-09-23 18:47:36.286748 | orchestrator | "",
2025-09-23 18:47:36.286758 | orchestrator | "Checking service: kolla-ansible (Kolla-Ansible Service)",
2025-09-23 18:47:36.286768 | orchestrator | " Expected: registry.osism.tech/osism/kolla-ansible:2024.2",
2025-09-23 18:47:36.286808 | orchestrator | " Enabled: true",
2025-09-23 18:47:36.286818 | orchestrator | " Running: registry.osism.tech/osism/kolla-ansible:2024.2",
2025-09-23 18:47:36.286828 | orchestrator | " Status: ✅ MATCH",
2025-09-23 18:47:36.286839 | orchestrator | "",
2025-09-23 18:47:36.286849 | orchestrator | "Checking service: osismclient (OSISM Client)",
2025-09-23 18:47:36.286859 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest",
2025-09-23 18:47:36.286871 | orchestrator | " Enabled: true",
2025-09-23 18:47:36.286881 | orchestrator | " Running: registry.osism.tech/osism/osism:latest",
2025-09-23 18:47:36.286891 | orchestrator | " Status: ✅ MATCH",
2025-09-23 18:47:36.286901 | orchestrator | "",
2025-09-23 18:47:36.286911 | orchestrator | "Checking service: ara-server (ARA Server)",
2025-09-23 18:47:36.286921 | orchestrator | " Expected: registry.osism.tech/osism/ara-server:1.7.3",
2025-09-23 18:47:36.286931 | orchestrator | " Enabled: true",
2025-09-23 18:47:36.286942 | orchestrator | " Running:
registry.osism.tech/osism/ara-server:1.7.3", 2025-09-23 18:47:36.286951 | orchestrator | " Status: ✅ MATCH", 2025-09-23 18:47:36.286960 | orchestrator | "", 2025-09-23 18:47:36.286970 | orchestrator | "Checking service: mariadb (MariaDB for ARA)", 2025-09-23 18:47:36.286989 | orchestrator | " Expected: registry.osism.tech/dockerhub/library/mariadb:11.8.3", 2025-09-23 18:47:36.286999 | orchestrator | " Enabled: true", 2025-09-23 18:47:36.287009 | orchestrator | " Running: registry.osism.tech/dockerhub/library/mariadb:11.8.3", 2025-09-23 18:47:36.287018 | orchestrator | " Status: ✅ MATCH", 2025-09-23 18:47:36.287028 | orchestrator | "", 2025-09-23 18:47:36.287038 | orchestrator | "Checking service: frontend (OSISM Frontend)", 2025-09-23 18:47:36.287049 | orchestrator | " Expected: registry.osism.tech/osism/osism-frontend:latest", 2025-09-23 18:47:36.287059 | orchestrator | " Enabled: true", 2025-09-23 18:47:36.287069 | orchestrator | " Running: registry.osism.tech/osism/osism-frontend:latest", 2025-09-23 18:47:36.287077 | orchestrator | " Status: ✅ MATCH", 2025-09-23 18:47:36.287087 | orchestrator | "", 2025-09-23 18:47:36.287097 | orchestrator | "Checking service: redis (Redis Cache)", 2025-09-23 18:47:36.287107 | orchestrator | " Expected: registry.osism.tech/dockerhub/library/redis:7.4.5-alpine", 2025-09-23 18:47:36.287117 | orchestrator | " Enabled: true", 2025-09-23 18:47:36.287126 | orchestrator | " Running: registry.osism.tech/dockerhub/library/redis:7.4.5-alpine", 2025-09-23 18:47:36.287136 | orchestrator | " Status: ✅ MATCH", 2025-09-23 18:47:36.287145 | orchestrator | "", 2025-09-23 18:47:36.287155 | orchestrator | "Checking service: api (OSISM API Service)", 2025-09-23 18:47:36.287165 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287175 | orchestrator | " Enabled: true", 2025-09-23 18:47:36.287185 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287194 | orchestrator | " 
Status: ✅ MATCH", 2025-09-23 18:47:36.287205 | orchestrator | "", 2025-09-23 18:47:36.287215 | orchestrator | "Checking service: listener (OpenStack Event Listener)", 2025-09-23 18:47:36.287224 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287233 | orchestrator | " Enabled: true", 2025-09-23 18:47:36.287244 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287254 | orchestrator | " Status: ✅ MATCH", 2025-09-23 18:47:36.287264 | orchestrator | "", 2025-09-23 18:47:36.287273 | orchestrator | "Checking service: openstack (OpenStack Integration)", 2025-09-23 18:47:36.287283 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287314 | orchestrator | " Enabled: true", 2025-09-23 18:47:36.287324 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287334 | orchestrator | " Status: ✅ MATCH", 2025-09-23 18:47:36.287344 | orchestrator | "", 2025-09-23 18:47:36.287353 | orchestrator | "Checking service: beat (Celery Beat Scheduler)", 2025-09-23 18:47:36.287363 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287373 | orchestrator | " Enabled: true", 2025-09-23 18:47:36.287383 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287402 | orchestrator | " Status: ✅ MATCH", 2025-09-23 18:47:36.287411 | orchestrator | "", 2025-09-23 18:47:36.287420 | orchestrator | "Checking service: flower (Celery Flower Monitor)", 2025-09-23 18:47:36.287449 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287459 | orchestrator | " Enabled: true", 2025-09-23 18:47:36.287469 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-23 18:47:36.287478 | orchestrator | " Status: ✅ MATCH", 2025-09-23 18:47:36.287489 | orchestrator | "", 2025-09-23 18:47:36.287500 | orchestrator | "=== Summary ===", 2025-09-23 
18:47:36.287509 | orchestrator | "Errors (version mismatches): 0", 2025-09-23 18:47:36.287519 | orchestrator | "Warnings (expected containers not running): 0", 2025-09-23 18:47:36.287530 | orchestrator | "", 2025-09-23 18:47:36.287540 | orchestrator | "✅ All running containers match expected versions!" 2025-09-23 18:47:36.287550 | orchestrator | ] 2025-09-23 18:47:36.287561 | orchestrator | } 2025-09-23 18:47:36.287571 | orchestrator | 2025-09-23 18:47:36.287582 | orchestrator | TASK [osism.services.manager : Skip version check due to service configuration] *** 2025-09-23 18:47:36.334806 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:47:36.334894 | orchestrator | 2025-09-23 18:47:36.334911 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 18:47:36.334928 | orchestrator | testbed-manager : ok=70 changed=37 unreachable=0 failed=0 skipped=13 rescued=0 ignored=0 2025-09-23 18:47:36.334939 | orchestrator | 2025-09-23 18:47:36.452619 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-09-23 18:47:36.452708 | orchestrator | + deactivate 2025-09-23 18:47:36.452747 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2025-09-23 18:47:36.452756 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-09-23 18:47:36.452762 | orchestrator | + export PATH 2025-09-23 18:47:36.452777 | orchestrator | + unset _OLD_VIRTUAL_PATH 2025-09-23 18:47:36.452784 | orchestrator | + '[' -n '' ']' 2025-09-23 18:47:36.452789 | orchestrator | + hash -r 2025-09-23 18:47:36.452964 | orchestrator | + '[' -n '' ']' 2025-09-23 18:47:36.453046 | orchestrator | + unset VIRTUAL_ENV 2025-09-23 18:47:36.453065 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2025-09-23 18:47:36.453086 | orchestrator | + '[' '!' 
'' = nondestructive ']' 2025-09-23 18:47:36.453106 | orchestrator | + unset -f deactivate 2025-09-23 18:47:36.453125 | orchestrator | + cp /home/dragon/.ssh/id_rsa.pub /opt/ansible/secrets/id_rsa.operator.pub 2025-09-23 18:47:36.462169 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2025-09-23 18:47:36.462222 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2025-09-23 18:47:36.462235 | orchestrator | + local max_attempts=60 2025-09-23 18:47:36.462247 | orchestrator | + local name=ceph-ansible 2025-09-23 18:47:36.462258 | orchestrator | + local attempt_num=1 2025-09-23 18:47:36.462502 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 18:47:36.492048 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-23 18:47:36.492163 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2025-09-23 18:47:36.492191 | orchestrator | + local max_attempts=60 2025-09-23 18:47:36.492212 | orchestrator | + local name=kolla-ansible 2025-09-23 18:47:36.492228 | orchestrator | + local attempt_num=1 2025-09-23 18:47:36.492537 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2025-09-23 18:47:36.525383 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-23 18:47:36.525450 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2025-09-23 18:47:36.525463 | orchestrator | + local max_attempts=60 2025-09-23 18:47:36.525475 | orchestrator | + local name=osism-ansible 2025-09-23 18:47:36.525486 | orchestrator | + local attempt_num=1 2025-09-23 18:47:36.525994 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2025-09-23 18:47:36.558417 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-23 18:47:36.558498 | orchestrator | + [[ true == \t\r\u\e ]] 2025-09-23 18:47:36.558515 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2025-09-23 18:47:37.353180 | orchestrator | + docker compose 
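The `wait_for_container_healthy` calls traced above expose the function's variables (`max_attempts`, `name`, `attempt_num`) and its `docker inspect` health probe. A minimal sketch of such a polling loop, where the retry sleep and the failure message are assumptions not taken from the real deploy script:

```shell
#!/bin/sh
# Hedged sketch of a health-wait loop matching the traced variables
# (max_attempts, name, attempt_num). The 5-second retry interval and
# the error message are assumptions, not from the actual script.
wait_for_container_healthy() {
    max_attempts="$1"
    name="$2"
    attempt_num=1
    until [ "$(docker inspect -f '{{.State.Health.Status}}' "$name" 2>/dev/null)" = healthy ]; do
        if [ "$attempt_num" -ge "$max_attempts" ]; then
            echo "container $name not healthy after $max_attempts attempts" >&2
            return 1
        fi
        attempt_num=$((attempt_num + 1))
        sleep 5
    done
}
```

In the log all three containers (`ceph-ansible`, `kolla-ansible`, `osism-ansible`) report `healthy` on the first probe, so the loop body never executes.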
--project-directory /opt/manager ps 2025-09-23 18:47:37.548811 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS 2025-09-23 18:47:37.548941 | orchestrator | ceph-ansible registry.osism.tech/osism/ceph-ansible:reef "/entrypoint.sh osis…" ceph-ansible 2 minutes ago Up About a minute (healthy) 2025-09-23 18:47:37.548957 | orchestrator | kolla-ansible registry.osism.tech/osism/kolla-ansible:2024.2 "/entrypoint.sh osis…" kolla-ansible 2 minutes ago Up About a minute (healthy) 2025-09-23 18:47:37.548969 | orchestrator | manager-api-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" api 2 minutes ago Up 2 minutes (healthy) 192.168.16.5:8000->8000/tcp 2025-09-23 18:47:37.548982 | orchestrator | manager-ara-server-1 registry.osism.tech/osism/ara-server:1.7.3 "sh -c '/wait && /ru…" ara-server 2 minutes ago Up About a minute (healthy) 8000/tcp 2025-09-23 18:47:37.548994 | orchestrator | manager-beat-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" beat 2 minutes ago Up 2 minutes (healthy) 2025-09-23 18:47:37.549004 | orchestrator | manager-flower-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" flower 2 minutes ago Up 2 minutes (healthy) 2025-09-23 18:47:37.549033 | orchestrator | manager-inventory_reconciler-1 registry.osism.tech/osism/inventory-reconciler:latest "/sbin/tini -- /entr…" inventory_reconciler 2 minutes ago Up 58 seconds (healthy) 2025-09-23 18:47:37.549044 | orchestrator | manager-listener-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" listener 2 minutes ago Up 2 minutes (healthy) 2025-09-23 18:47:37.549055 | orchestrator | manager-mariadb-1 registry.osism.tech/dockerhub/library/mariadb:11.8.3 "docker-entrypoint.s…" mariadb 2 minutes ago Up 2 minutes (healthy) 3306/tcp 2025-09-23 18:47:37.549066 | orchestrator | manager-openstack-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" openstack 2 minutes ago Up 2 minutes (healthy) 2025-09-23 18:47:37.549077 | orchestrator | 
manager-redis-1 registry.osism.tech/dockerhub/library/redis:7.4.5-alpine "docker-entrypoint.s…" redis 2 minutes ago Up 2 minutes (healthy) 6379/tcp 2025-09-23 18:47:37.549088 | orchestrator | osism-ansible registry.osism.tech/osism/osism-ansible:latest "/entrypoint.sh osis…" osism-ansible 2 minutes ago Up About a minute (healthy) 2025-09-23 18:47:37.549099 | orchestrator | osism-frontend registry.osism.tech/osism/osism-frontend:latest "docker-entrypoint.s…" frontend 2 minutes ago Up 2 minutes 192.168.16.5:3000->3000/tcp 2025-09-23 18:47:37.549110 | orchestrator | osism-kubernetes registry.osism.tech/osism/osism-kubernetes:latest "/entrypoint.sh osis…" osism-kubernetes 2 minutes ago Up About a minute (healthy) 2025-09-23 18:47:37.549121 | orchestrator | osismclient registry.osism.tech/osism/osism:latest "/sbin/tini -- sleep…" osismclient 2 minutes ago Up 2 minutes (healthy) 2025-09-23 18:47:37.557933 | orchestrator | ++ semver latest 7.0.0 2025-09-23 18:47:37.625341 | orchestrator | + [[ -1 -ge 0 ]] 2025-09-23 18:47:37.625426 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2025-09-23 18:47:37.625441 | orchestrator | + sed -i s/community.general.yaml/osism.commons.still_alive/ /opt/configuration/environments/ansible.cfg 2025-09-23 18:47:37.626744 | orchestrator | + osism apply resolvconf -l testbed-manager 2025-09-23 18:47:49.930723 | orchestrator | 2025-09-23 18:47:49 | INFO  | Task 8eb45136-cc54-4d47-85f1-bb3d5cf00232 (resolvconf) was prepared for execution. 2025-09-23 18:47:49.930843 | orchestrator | 2025-09-23 18:47:49 | INFO  | It takes a moment until task 8eb45136-cc54-4d47-85f1-bb3d5cf00232 (resolvconf) has been started and output is visible here. 
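The trace above shows `semver latest 7.0.0` returning `-1`, the numeric test `[[ -1 -ge 0 ]]` failing, and an explicit `[[ latest == latest ]]` fallback deciding that the rolling tag still qualifies before the `sed` patch to `ansible.cfg` is applied. A minimal sketch of such a version gate, using `sort -V` as a stand-in for the `semver` helper invoked in the log:

```shell
#!/bin/sh
# Hedged sketch: treat the rolling "latest" tag as always new enough,
# otherwise compare versions with sort -V (a stand-in assumption for
# the semver helper the deploy script actually calls).
version_is_at_least() {
    current="$1"
    required="$2"
    [ "$current" = latest ] && return 0
    # sort -V orders versions numerically; if the required version
    # sorts first (or equal), the current version is at least it
    [ "$(printf '%s\n%s\n' "$required" "$current" | sort -V | head -n 1)" = "$required" ]
}
```

With `current=latest`, the special case fires exactly as in the trace; a plain tag such as `7.1.0` would instead go through the `sort -V` comparison.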
2025-09-23 18:48:05.968162 | orchestrator | 2025-09-23 18:48:05.968295 | orchestrator | PLAY [Apply role resolvconf] *************************************************** 2025-09-23 18:48:05.968307 | orchestrator | 2025-09-23 18:48:05.968315 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-23 18:48:05.968351 | orchestrator | Tuesday 23 September 2025 18:47:53 +0000 (0:00:00.160) 0:00:00.160 ***** 2025-09-23 18:48:05.968359 | orchestrator | ok: [testbed-manager] 2025-09-23 18:48:05.968368 | orchestrator | 2025-09-23 18:48:05.968374 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2025-09-23 18:48:05.968382 | orchestrator | Tuesday 23 September 2025 18:47:57 +0000 (0:00:03.908) 0:00:04.069 ***** 2025-09-23 18:48:05.968389 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:48:05.968396 | orchestrator | 2025-09-23 18:48:05.968402 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2025-09-23 18:48:05.968409 | orchestrator | Tuesday 23 September 2025 18:47:57 +0000 (0:00:00.068) 0:00:04.138 ***** 2025-09-23 18:48:05.968415 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager 2025-09-23 18:48:05.968423 | orchestrator | 2025-09-23 18:48:05.968429 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2025-09-23 18:48:05.968435 | orchestrator | Tuesday 23 September 2025 18:47:58 +0000 (0:00:00.101) 0:00:04.240 ***** 2025-09-23 18:48:05.968449 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager 2025-09-23 18:48:05.968456 | orchestrator | 2025-09-23 18:48:05.968463 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring 
/etc/resolv.conf] *** 2025-09-23 18:48:05.968469 | orchestrator | Tuesday 23 September 2025 18:47:58 +0000 (0:00:00.082) 0:00:04.323 ***** 2025-09-23 18:48:05.968475 | orchestrator | ok: [testbed-manager] 2025-09-23 18:48:05.968481 | orchestrator | 2025-09-23 18:48:05.968488 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2025-09-23 18:48:05.968494 | orchestrator | Tuesday 23 September 2025 18:47:59 +0000 (0:00:01.139) 0:00:05.462 ***** 2025-09-23 18:48:05.968500 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:48:05.968507 | orchestrator | 2025-09-23 18:48:05.968513 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2025-09-23 18:48:05.968520 | orchestrator | Tuesday 23 September 2025 18:47:59 +0000 (0:00:00.065) 0:00:05.528 ***** 2025-09-23 18:48:05.968526 | orchestrator | ok: [testbed-manager] 2025-09-23 18:48:05.968532 | orchestrator | 2025-09-23 18:48:05.968538 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2025-09-23 18:48:05.968544 | orchestrator | Tuesday 23 September 2025 18:48:00 +0000 (0:00:01.509) 0:00:07.037 ***** 2025-09-23 18:48:05.968551 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:48:05.968557 | orchestrator | 2025-09-23 18:48:05.968563 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2025-09-23 18:48:05.968571 | orchestrator | Tuesday 23 September 2025 18:48:00 +0000 (0:00:00.076) 0:00:07.113 ***** 2025-09-23 18:48:05.968577 | orchestrator | changed: [testbed-manager] 2025-09-23 18:48:05.968583 | orchestrator | 2025-09-23 18:48:05.968589 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2025-09-23 18:48:05.968595 | orchestrator | Tuesday 23 September 2025 18:48:01 +0000 (0:00:00.522) 0:00:07.636 ***** 2025-09-23 18:48:05.968602 | orchestrator | changed: 
[testbed-manager] 2025-09-23 18:48:05.968608 | orchestrator | 2025-09-23 18:48:05.968614 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2025-09-23 18:48:05.968620 | orchestrator | Tuesday 23 September 2025 18:48:03 +0000 (0:00:02.118) 0:00:09.754 ***** 2025-09-23 18:48:05.968626 | orchestrator | ok: [testbed-manager] 2025-09-23 18:48:05.968633 | orchestrator | 2025-09-23 18:48:05.968639 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2025-09-23 18:48:05.968645 | orchestrator | Tuesday 23 September 2025 18:48:04 +0000 (0:00:00.978) 0:00:10.733 ***** 2025-09-23 18:48:05.968669 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager 2025-09-23 18:48:05.968675 | orchestrator | 2025-09-23 18:48:05.968681 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2025-09-23 18:48:05.968688 | orchestrator | Tuesday 23 September 2025 18:48:04 +0000 (0:00:00.096) 0:00:10.830 ***** 2025-09-23 18:48:05.968694 | orchestrator | changed: [testbed-manager] 2025-09-23 18:48:05.968700 | orchestrator | 2025-09-23 18:48:05.968706 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 18:48:05.968714 | orchestrator | testbed-manager : ok=10  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-09-23 18:48:05.968720 | orchestrator | 2025-09-23 18:48:05.968726 | orchestrator | 2025-09-23 18:48:05.968733 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 18:48:05.968739 | orchestrator | Tuesday 23 September 2025 18:48:05 +0000 (0:00:01.109) 0:00:11.940 ***** 2025-09-23 18:48:05.968745 | orchestrator | =============================================================================== 2025-09-23 18:48:05.968751 | 
orchestrator | Gathering Facts --------------------------------------------------------- 3.91s 2025-09-23 18:48:05.968757 | orchestrator | osism.commons.resolvconf : Copy configuration files --------------------- 2.12s 2025-09-23 18:48:05.968763 | orchestrator | osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf ----- 1.51s 2025-09-23 18:48:05.968770 | orchestrator | osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf --- 1.14s 2025-09-23 18:48:05.968776 | orchestrator | osism.commons.resolvconf : Restart systemd-resolved service ------------- 1.11s 2025-09-23 18:48:05.968782 | orchestrator | osism.commons.resolvconf : Start/enable systemd-resolved service -------- 0.98s 2025-09-23 18:48:05.968802 | orchestrator | osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf --- 0.52s 2025-09-23 18:48:05.968808 | orchestrator | osism.commons.resolvconf : Include resolvconf tasks --------------------- 0.10s 2025-09-23 18:48:05.968815 | orchestrator | osism.commons.resolvconf : Include distribution specific configuration tasks --- 0.10s 2025-09-23 18:48:05.968821 | orchestrator | osism.commons.resolvconf : Include distribution specific installation tasks --- 0.08s 2025-09-23 18:48:05.968827 | orchestrator | osism.commons.resolvconf : Archive existing file /etc/resolv.conf ------- 0.08s 2025-09-23 18:48:05.968833 | orchestrator | osism.commons.resolvconf : Check minimum and maximum number of name servers --- 0.07s 2025-09-23 18:48:05.968840 | orchestrator | osism.commons.resolvconf : Install package systemd-resolved ------------- 0.07s 2025-09-23 18:48:06.318621 | orchestrator | + osism apply sshconfig 2025-09-23 18:48:18.454373 | orchestrator | 2025-09-23 18:48:18 | INFO  | Task 23e7cd25-94e9-47e9-b4ad-be4ea8bfbde5 (sshconfig) was prepared for execution. 
2025-09-23 18:48:18.454490 | orchestrator | 2025-09-23 18:48:18 | INFO  | It takes a moment until task 23e7cd25-94e9-47e9-b4ad-be4ea8bfbde5 (sshconfig) has been started and output is visible here. 2025-09-23 18:48:30.487387 | orchestrator | 2025-09-23 18:48:30.487510 | orchestrator | PLAY [Apply role sshconfig] **************************************************** 2025-09-23 18:48:30.487527 | orchestrator | 2025-09-23 18:48:30.487539 | orchestrator | TASK [osism.commons.sshconfig : Get home directory of operator user] *********** 2025-09-23 18:48:30.487551 | orchestrator | Tuesday 23 September 2025 18:48:22 +0000 (0:00:00.169) 0:00:00.169 ***** 2025-09-23 18:48:30.487562 | orchestrator | ok: [testbed-manager] 2025-09-23 18:48:30.487574 | orchestrator | 2025-09-23 18:48:30.487585 | orchestrator | TASK [osism.commons.sshconfig : Ensure .ssh/config.d exist] ******************** 2025-09-23 18:48:30.487596 | orchestrator | Tuesday 23 September 2025 18:48:23 +0000 (0:00:00.547) 0:00:00.716 ***** 2025-09-23 18:48:30.487607 | orchestrator | changed: [testbed-manager] 2025-09-23 18:48:30.487618 | orchestrator | 2025-09-23 18:48:30.487629 | orchestrator | TASK [osism.commons.sshconfig : Ensure config for each host exist] ************* 2025-09-23 18:48:30.487639 | orchestrator | Tuesday 23 September 2025 18:48:23 +0000 (0:00:00.526) 0:00:01.243 ***** 2025-09-23 18:48:30.487677 | orchestrator | changed: [testbed-manager] => (item=testbed-manager) 2025-09-23 18:48:30.487689 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3) 2025-09-23 18:48:30.487700 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4) 2025-09-23 18:48:30.487710 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5) 2025-09-23 18:48:30.487721 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0) 2025-09-23 18:48:30.487732 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1) 2025-09-23 18:48:30.487743 | orchestrator | changed: 
[testbed-manager] => (item=testbed-node-2) 2025-09-23 18:48:30.487753 | orchestrator | 2025-09-23 18:48:30.487764 | orchestrator | TASK [osism.commons.sshconfig : Add extra config] ****************************** 2025-09-23 18:48:30.487775 | orchestrator | Tuesday 23 September 2025 18:48:29 +0000 (0:00:05.921) 0:00:07.165 ***** 2025-09-23 18:48:30.487785 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:48:30.487796 | orchestrator | 2025-09-23 18:48:30.487806 | orchestrator | TASK [osism.commons.sshconfig : Assemble ssh config] *************************** 2025-09-23 18:48:30.487817 | orchestrator | Tuesday 23 September 2025 18:48:29 +0000 (0:00:00.065) 0:00:07.230 ***** 2025-09-23 18:48:30.487828 | orchestrator | changed: [testbed-manager] 2025-09-23 18:48:30.487838 | orchestrator | 2025-09-23 18:48:30.487849 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 18:48:30.487861 | orchestrator | testbed-manager : ok=4  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 18:48:30.487872 | orchestrator | 2025-09-23 18:48:30.487883 | orchestrator | 2025-09-23 18:48:30.487894 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 18:48:30.487905 | orchestrator | Tuesday 23 September 2025 18:48:30 +0000 (0:00:00.608) 0:00:07.839 ***** 2025-09-23 18:48:30.487916 | orchestrator | =============================================================================== 2025-09-23 18:48:30.487926 | orchestrator | osism.commons.sshconfig : Ensure config for each host exist ------------- 5.92s 2025-09-23 18:48:30.487937 | orchestrator | osism.commons.sshconfig : Assemble ssh config --------------------------- 0.61s 2025-09-23 18:48:30.487947 | orchestrator | osism.commons.sshconfig : Get home directory of operator user ----------- 0.55s 2025-09-23 18:48:30.487958 | orchestrator | osism.commons.sshconfig : Ensure .ssh/config.d exist 
-------------------- 0.53s 2025-09-23 18:48:30.487968 | orchestrator | osism.commons.sshconfig : Add extra config ------------------------------ 0.07s 2025-09-23 18:48:30.860985 | orchestrator | + osism apply known-hosts 2025-09-23 18:48:42.929345 | orchestrator | 2025-09-23 18:48:42 | INFO  | Task ad3fc4e7-9a90-46d9-8e0e-43d21799cba9 (known-hosts) was prepared for execution. 2025-09-23 18:48:42.929453 | orchestrator | 2025-09-23 18:48:42 | INFO  | It takes a moment until task ad3fc4e7-9a90-46d9-8e0e-43d21799cba9 (known-hosts) has been started and output is visible here. 2025-09-23 18:49:00.436946 | orchestrator | 2025-09-23 18:49:00.437079 | orchestrator | PLAY [Apply role known_hosts] ************************************************** 2025-09-23 18:49:00.437095 | orchestrator | 2025-09-23 18:49:00.437105 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname] *** 2025-09-23 18:49:00.437115 | orchestrator | Tuesday 23 September 2025 18:48:47 +0000 (0:00:00.186) 0:00:00.186 ***** 2025-09-23 18:49:00.437130 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2025-09-23 18:49:00.437142 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2025-09-23 18:49:00.437210 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2025-09-23 18:49:00.437222 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2025-09-23 18:49:00.437272 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2025-09-23 18:49:00.437283 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2025-09-23 18:49:00.437292 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2025-09-23 18:49:00.437300 | orchestrator | 2025-09-23 18:49:00.437309 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname] *** 2025-09-23 18:49:00.437339 | orchestrator | Tuesday 23 September 2025 18:48:53 +0000 (0:00:06.180) 0:00:06.367 ***** 2025-09-23 
18:49:00.437359 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2025-09-23 18:49:00.437371 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2025-09-23 18:49:00.437380 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2025-09-23 18:49:00.437389 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2025-09-23 18:49:00.437397 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2025-09-23 18:49:00.437411 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2025-09-23 18:49:00.437424 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2025-09-23 18:49:00.437433 | orchestrator | 2025-09-23 18:49:00.437442 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:00.437450 | orchestrator | Tuesday 23 September 2025 18:48:53 +0000 (0:00:00.186) 0:00:06.554 ***** 2025-09-23 18:49:00.437460 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOOgZTg1dlJZFlm4ubjv8rnb4AxiwUT2NfTjw0mccWfVge+ZmYx7YGqGViolHzLSFagzxsbVuYIhuhqc4iN5Lt8=) 2025-09-23 18:49:00.437473 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCoZwQ+DGtMEXh5nRCfbK579GbPt6js6LiQJAbw07NIg5VfeP8iScxTiDfMAXcMzfQO2CmUJ0RCHs1ivv5+UrtlT9GQL+DpukeRLrxD3FVIpDnikh8oAaUTkXa/bO2AvOCcU4pu08wNDv4JXSu7PVlDP60Yg4Mz9gNDZyLS2uGMNsM9cpOTJWw6+ejD1bamk9zyeB3eTNmRAqsCfLYyGQYOoqRnsWb+5c7qp8QIn+CnV5qCSwm6SXLED9sHK5H2BdWP2LJqxCqqdJE3pPzVUsa2W25w7iiPiwm3AaIhY9DLfIt0t29J/suQEjLLMVVrSjYmIdNThYMBs2tOV8wF3LqTGXjCp+MpEof9xgLJZCtD0jz83wjIbz0lpGZMVqYnXijPqv7HOpEvufG/NUKtmUwLMF2Xt070GtUqgqMc/nLEOUSiaUMNDAk/GDv1QLhLSja57LeZwKPdWeTojcUhIU0aS8JXHiJ2togh69ySniDoxKGybnCx3W4JYosBrQe5JgM=) 2025-09-23 18:49:00.437487 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMD4aDXYTVaoTeZ3g71u2nSTmyvByhrWJDLzVv5Pn5mm) 2025-09-23 18:49:00.437498 | orchestrator | 2025-09-23 18:49:00.437507 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:00.437517 | orchestrator | Tuesday 23 September 2025 18:48:54 +0000 (0:00:01.260) 0:00:07.814 ***** 2025-09-23 18:49:00.437549 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC0Af0hFBojIjkjLpN6BU9psmJ5nIY22u1nd0zEGg0+hil6o738WQ0RKSeXXI7vPbmq3mfUG6hfBVu/mX2V2TqOgrsdFa8+1pdNm5nM6YXbIGRPm8vFcwIh8Ytq+907PrvGiA/6C/h894011yDgXKBtPXjH/vTYXnvx1JwfUr0yqU6u5IGzxJyMsB/bA0UBXOIqDXD+2WETsqQh+vJID9/90oWUnlvhdb3uxqofu7vGzIZziE3IyyA3qfV7tvjYHnU5G5YkxUnB0AwzymlJ4IRnP14oC+6hyfIk6wYwmSH7J2Mg98HKyfaSMyHv+Ncisy17QsBoP6rgV9hgiZum5k1MCqFSPfWgxxtui6AUlVVgUNmFvwciuIFnlGsyvkierMBaYyVbDTilg2td4g6X8cldvv49MiwCZNnmdlfdNqc5Swa5YaC6RdAVOsu5ABqCMCENG4OL3XbH5jHteZgHg/TamgG8mIFSvcGLIkhXP6SVRs09kE4bcbUTl0DdsRNI2x8=) 2025-09-23 18:49:00.437561 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 
ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI8RzyMNXFR/RPqn7RmV/DItars9Mhb9ngDDiJ+QNpOcOLsE/4JbY/AsJRvBcNxpCziIjXPTQ2w1nA0BrgtS9lc=) 2025-09-23 18:49:00.437578 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAxUYylwluJag646xt76v+2upawbbqJGSyNyHAhR9YY9) 2025-09-23 18:49:00.437588 | orchestrator | 2025-09-23 18:49:00.437598 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:00.437608 | orchestrator | Tuesday 23 September 2025 18:48:55 +0000 (0:00:01.072) 0:00:08.887 ***** 2025-09-23 18:49:00.437618 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2SVXKdAkYZIewLUkHzkKKWIsBuf6mF4htSl/FJQqgPPrCDLa09kDHrQo+3gPvse03OZLb4Gyfr/hScpkNsI4Tqi/sWmU8hAUfTvsUoaZpkKqaHULAgNaP0u00BC+83xMIronMuaDTH/9cf4ocfJFvS7xw6TkFvr036g8WNAK6zZmjHN+/YZaxsaqzcC0kikwMMI2vgP4TdsTuVPpbI7kOI/FMLV44wyLruqp8bQeLheejyeucG9cnkutEMauIMCeSgBjFXyerSKrJ1zgKHb6+z4Fx6WgyfR53meToxJ3t3mgHrgRycgmF8EJTPwt7eQ65s/QYUfDMCO3Q0o38qzxwbM/yPWUllsoRVysZUrSFDpIqF5VzUHpikPr35w7S7g+iJI0veOMuvI9Z8/lD1TRKAKWDHD+g05/CrX9m9vOsrDt/PBxl9ZrmUaFaewNhNi3PikOZ+viSJgqmglB1DjDbOzIMRRoyuYEvr8qjI5aGB27e6FRLdvFX0t4Nu7CxxsE=) 2025-09-23 18:49:00.437628 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFHU3ytlerK////LjFGsziSr9FeOBZZhZVhBgwQ5QE/O) 2025-09-23 18:49:00.437693 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFgEhXAR+dLDTKwtcO8N5kU/VtM8tBnQVHaAvShOuhdwnV1fxAJfFRFiIQwmdcl0TfwuKtUfOTZT5E1ygzLy8bA=) 2025-09-23 18:49:00.437704 | orchestrator | 2025-09-23 18:49:00.437714 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:00.437724 | orchestrator | Tuesday 23 September 2025 18:48:56 +0000 (0:00:01.145) 0:00:10.032 ***** 
2025-09-23 18:49:00.437738 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzMFrvla/VVKHNXIoH/EDMyOoNQhDqBE8sRT+N+OIuPaIIm3TIx+JXu9QMdA3UhrdAYaaEA0zLtKpv+Ssonwvf6O3u0rJSsHpm+jHVxsUMELgMteoBNiXe322P17t5ah9KfELfr+ox9RgxrqhGrgfYKjXb9Tey2y6QypbtTIXF7l/LAS3JPrQ9hISZrP52KTEf7Yne4iTdGR3Vr/oHP6fXaHKe8hCERS+fLNskjNglKbsMbPo/C8Rj5jNq0uNTtOaM3mez8gdRzUTbsivt31NraqRSrlfB3Gwcqqqe/7EwW4YXrHMmRZGfPgeSUN2gnHW6v/Lngi7APpe/eGTp+BsGBLHjLuDUBgRPsh0J93LmMnvWk6fOCo3QMoSaEy99UbfdaXlQyyH0dkaqu0XTnSl4dDSU3laG+esn5uL0z7/DHvZaqoJ6FhgheSIbVM7hbjbBXYl7wY05k7CVZ7MuULKkIAJ9uLbgs3X5JJ3OyXnledKk3+1DpIUBp3xzUHxU8iM=) 2025-09-23 18:49:00.437748 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA1fGfWlujwUz8kKa27gAatQW8NCHaQUoxGyQaFDbRVYbz5jxmmeGEKkHHGTdtbt3QyAxLBvQJ4yBZcHn+27lCg=) 2025-09-23 18:49:00.437759 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIsS30+FS6h+5tyQAvp2wjOBnUU+1n6voY3/QS3t7Fso) 2025-09-23 18:49:00.437769 | orchestrator | 2025-09-23 18:49:00.437778 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:00.437789 | orchestrator | Tuesday 23 September 2025 18:48:58 +0000 (0:00:01.172) 0:00:11.205 ***** 2025-09-23 18:49:00.437806 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKgGivis+OaLw6m9lEescIRqkbPZ2lWqOnQ1b8Tj0nqSNJZSEVLbiMfmYwIUZvzYNlSKhjIv6cT12g+yPg9AXTk=) 2025-09-23 18:49:00.437817 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF62dlsAnCMA3cqzMJzkiddLj8Z0cxNP0cVCcYaMwt7K) 2025-09-23 18:49:00.437827 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDO+L3xUrTj7aC/XUVnhonOCZWYcGR/HxXAAbbzmWQWOj/BlMr/vJpSildFhsLUs4r4OtMxkxJYvmWoV4eRqoE7AAv9paWp0KHRcQhGu/qbslYjzcbP+34pU98vkHeYAQXvr4ZdIIUGQfFcpyC/KHHwwsn41i1nNF2vuA+xWsH7+2VvQk5g07o3QjSbka67EyEKWgvvX+o9U6IaRKNkJ2W+1CB34KFjdg1vyh2eZjisO0S87ZR5IqhaU5tmUFnTYYqXEHDVB4niM//F4N3djeixwQFfESxRf/OqREC4hd1qzWA7w6AmKqhM2CmXLs1lN02ZF8yWv/Pqn3gDBWKuWvXdTSmt63CWOnKNoX0N0HsA56iUq7BLaIoNut7ASCpy8rJJ+7ENPsWsnSrOsqaaU3lUeM6/Li/LXJsl6QAlgyO53yZXH43OJIwTn5A2HywlfqBsFLkqXq1P1Qc6n5GC12FwR1UZupM8txNOaEwE9dcGD9yEeravgjQ1QkcAWERooF0=) 2025-09-23 18:49:00.437846 | orchestrator | 2025-09-23 18:49:00.437855 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:00.437863 | orchestrator | Tuesday 23 September 2025 18:48:59 +0000 (0:00:01.116) 0:00:12.321 ***** 2025-09-23 18:49:00.437880 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2iAk/ZhWdbqOOPqrItgvhagJIkjP3uN9sCyIoAO/VYrR/kjLJSXJMHgC0r4MvFoWCuAUrzu8QxudlF8wRUY35LiVxl/OADB/zvMXGbhOvSirqwo3zmhELpt+gLSF++UGcx0K3vBpS961vYfxfZIqd65lleME55cBdHLqmTdiDr/E8NkUmRkNFF0DoN9VuViXaWbWSVP1ZWrK7cKGjOObbQ25X3Zf/M7EaW1SPAahLH1vMGuWAaJ3DYALSscQ1/59nAcC1fJi95WxduYE+8+3tb6PmGgnlBApn9x0iUHm4NRrDa3EMhR+tOCClcqNwH6rTeDhYtvUF5l8B+rQMHZ0eNruoqYi+KgZYESITICMvPEXmxTGNJOSettHZk7ZiKWYdB9BDJ8p5y6nu953N2Fwbh+BMyMsEE/q++XHF2/lfe+wdWcxf9K6GCL6YKfyGTDF2Z8HCUsUByqIsIROjMEDckxcb7VZi3umkNQwuTBGtW6IKq3pZ9BIGod+jltd4Dn0=) 2025-09-23 18:49:11.704663 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOebNm2PeY4wYacPnwhYVg6A4AA4FpTO2S6iZwKQYXh2oNzq1TSf/6NfF+yZGPqoRYFEbwjEZs3PdGW17FNu7k4=) 2025-09-23 18:49:11.704739 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHc/ITN0M0vxZt96Eong/NBl4k86LQ27mQ+8s7SeLpCx) 2025-09-23 18:49:11.704748 | orchestrator | 2025-09-23 18:49:11.704753 | 
orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:11.704759 | orchestrator | Tuesday 23 September 2025 18:49:00 +0000 (0:00:01.163) 0:00:13.485 ***** 2025-09-23 18:49:11.704765 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDOXxVnEcYMAyhF4lJzmO0qlcxzDW8Ym31Pol6GMp44bEdQSDzLZC9r9id+EE2yDyf99s4nedWb/PhH4dnntHF7ICr4EI/yWjbVDWsQdVkxIJ/rbvbmov9+TQeAwnmymIz+ghroWyExwg2eKn9CqYqsDZ4idDiKuvBYs+R0Dq6QdlKB88nil5CayZDyrOw0vFrUPoExCfr4Rj6XPhNZM8wbh/mVSiPHFkA8ZuG9j4FFjvZf4dUxsSpz0oO6dQ9DpKr+Gn9HsKDXfF1C7hqXgKm7wWMu1MIjaNQfC7qgE3ZxV9Hxiiu2WvwIK57OaL5a6jXqsSXVFgKz9fueF0q4uPKYKspRT015j5AfmSBtOFMcGND5E7Q0W+U+GRVa2W8NMfeg9DPIciK1yszrrD9byyvJXxrhM001vCD9sRXj3k0cTjuHfHNSXLWmQA/ixd+yWVs4gJ0OgExb3LOYRGIRNpS2TEH+69uFjoxrt/Fx1IM/yVR9F8EbDVuljlNVPyBRec=) 2025-09-23 18:49:11.704770 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKl+3K9luOwrZSD7nYEhLPqx8ThqS/SMCwhdlMdjbny3) 2025-09-23 18:49:11.704775 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJZJPrQmwkHQWvaIxHFV6vyBsnP5EXrUhOH6OuEdSJF0xjoluCK9ZDUFxesezo7D33/E7mO787FFyqVWcnRzB8A=) 2025-09-23 18:49:11.704779 | orchestrator | 2025-09-23 18:49:11.704783 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host] *** 2025-09-23 18:49:11.704788 | orchestrator | Tuesday 23 September 2025 18:49:01 +0000 (0:00:01.165) 0:00:14.651 ***** 2025-09-23 18:49:11.704793 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2025-09-23 18:49:11.704797 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2025-09-23 18:49:11.704801 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2025-09-23 18:49:11.704804 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2025-09-23 18:49:11.704808 | orchestrator | 
ok: [testbed-manager] => (item=testbed-node-3) 2025-09-23 18:49:11.704818 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2025-09-23 18:49:11.704823 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2025-09-23 18:49:11.704826 | orchestrator | 2025-09-23 18:49:11.704831 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host] *** 2025-09-23 18:49:11.704835 | orchestrator | Tuesday 23 September 2025 18:49:06 +0000 (0:00:05.399) 0:00:20.051 ***** 2025-09-23 18:49:11.704840 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2025-09-23 18:49:11.704856 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2025-09-23 18:49:11.704860 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2025-09-23 18:49:11.704864 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2025-09-23 18:49:11.704867 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2025-09-23 18:49:11.704871 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2025-09-23 18:49:11.704875 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2025-09-23 18:49:11.704878 | orchestrator | 2025-09-23 18:49:11.704882 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:11.704886 | orchestrator | Tuesday 23 September 2025 18:49:07 +0000 (0:00:00.172) 0:00:20.223 ***** 2025-09-23 18:49:11.704902 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCoZwQ+DGtMEXh5nRCfbK579GbPt6js6LiQJAbw07NIg5VfeP8iScxTiDfMAXcMzfQO2CmUJ0RCHs1ivv5+UrtlT9GQL+DpukeRLrxD3FVIpDnikh8oAaUTkXa/bO2AvOCcU4pu08wNDv4JXSu7PVlDP60Yg4Mz9gNDZyLS2uGMNsM9cpOTJWw6+ejD1bamk9zyeB3eTNmRAqsCfLYyGQYOoqRnsWb+5c7qp8QIn+CnV5qCSwm6SXLED9sHK5H2BdWP2LJqxCqqdJE3pPzVUsa2W25w7iiPiwm3AaIhY9DLfIt0t29J/suQEjLLMVVrSjYmIdNThYMBs2tOV8wF3LqTGXjCp+MpEof9xgLJZCtD0jz83wjIbz0lpGZMVqYnXijPqv7HOpEvufG/NUKtmUwLMF2Xt070GtUqgqMc/nLEOUSiaUMNDAk/GDv1QLhLSja57LeZwKPdWeTojcUhIU0aS8JXHiJ2togh69ySniDoxKGybnCx3W4JYosBrQe5JgM=) 2025-09-23 18:49:11.704906 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOOgZTg1dlJZFlm4ubjv8rnb4AxiwUT2NfTjw0mccWfVge+ZmYx7YGqGViolHzLSFagzxsbVuYIhuhqc4iN5Lt8=) 2025-09-23 18:49:11.704911 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMD4aDXYTVaoTeZ3g71u2nSTmyvByhrWJDLzVv5Pn5mm) 2025-09-23 18:49:11.704914 | orchestrator | 2025-09-23 18:49:11.704918 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:11.704922 | orchestrator | Tuesday 23 September 2025 18:49:08 +0000 (0:00:01.125) 0:00:21.349 ***** 2025-09-23 18:49:11.704926 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC0Af0hFBojIjkjLpN6BU9psmJ5nIY22u1nd0zEGg0+hil6o738WQ0RKSeXXI7vPbmq3mfUG6hfBVu/mX2V2TqOgrsdFa8+1pdNm5nM6YXbIGRPm8vFcwIh8Ytq+907PrvGiA/6C/h894011yDgXKBtPXjH/vTYXnvx1JwfUr0yqU6u5IGzxJyMsB/bA0UBXOIqDXD+2WETsqQh+vJID9/90oWUnlvhdb3uxqofu7vGzIZziE3IyyA3qfV7tvjYHnU5G5YkxUnB0AwzymlJ4IRnP14oC+6hyfIk6wYwmSH7J2Mg98HKyfaSMyHv+Ncisy17QsBoP6rgV9hgiZum5k1MCqFSPfWgxxtui6AUlVVgUNmFvwciuIFnlGsyvkierMBaYyVbDTilg2td4g6X8cldvv49MiwCZNnmdlfdNqc5Swa5YaC6RdAVOsu5ABqCMCENG4OL3XbH5jHteZgHg/TamgG8mIFSvcGLIkhXP6SVRs09kE4bcbUTl0DdsRNI2x8=) 2025-09-23 18:49:11.704930 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI8RzyMNXFR/RPqn7RmV/DItars9Mhb9ngDDiJ+QNpOcOLsE/4JbY/AsJRvBcNxpCziIjXPTQ2w1nA0BrgtS9lc=) 2025-09-23 18:49:11.704934 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAxUYylwluJag646xt76v+2upawbbqJGSyNyHAhR9YY9) 2025-09-23 18:49:11.704937 | orchestrator | 2025-09-23 18:49:11.704943 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:11.704947 | orchestrator | Tuesday 23 September 2025 18:49:09 +0000 (0:00:01.156) 0:00:22.505 ***** 2025-09-23 18:49:11.704951 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFgEhXAR+dLDTKwtcO8N5kU/VtM8tBnQVHaAvShOuhdwnV1fxAJfFRFiIQwmdcl0TfwuKtUfOTZT5E1ygzLy8bA=) 2025-09-23 18:49:11.704955 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFHU3ytlerK////LjFGsziSr9FeOBZZhZVhBgwQ5QE/O) 2025-09-23 18:49:11.704959 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC2SVXKdAkYZIewLUkHzkKKWIsBuf6mF4htSl/FJQqgPPrCDLa09kDHrQo+3gPvse03OZLb4Gyfr/hScpkNsI4Tqi/sWmU8hAUfTvsUoaZpkKqaHULAgNaP0u00BC+83xMIronMuaDTH/9cf4ocfJFvS7xw6TkFvr036g8WNAK6zZmjHN+/YZaxsaqzcC0kikwMMI2vgP4TdsTuVPpbI7kOI/FMLV44wyLruqp8bQeLheejyeucG9cnkutEMauIMCeSgBjFXyerSKrJ1zgKHb6+z4Fx6WgyfR53meToxJ3t3mgHrgRycgmF8EJTPwt7eQ65s/QYUfDMCO3Q0o38qzxwbM/yPWUllsoRVysZUrSFDpIqF5VzUHpikPr35w7S7g+iJI0veOMuvI9Z8/lD1TRKAKWDHD+g05/CrX9m9vOsrDt/PBxl9ZrmUaFaewNhNi3PikOZ+viSJgqmglB1DjDbOzIMRRoyuYEvr8qjI5aGB27e6FRLdvFX0t4Nu7CxxsE=) 2025-09-23 18:49:11.704963 | orchestrator | 2025-09-23 18:49:11.704966 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:11.704970 | orchestrator | Tuesday 23 September 2025 18:49:10 +0000 (0:00:01.070) 0:00:23.576 ***** 2025-09-23 18:49:11.704976 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzMFrvla/VVKHNXIoH/EDMyOoNQhDqBE8sRT+N+OIuPaIIm3TIx+JXu9QMdA3UhrdAYaaEA0zLtKpv+Ssonwvf6O3u0rJSsHpm+jHVxsUMELgMteoBNiXe322P17t5ah9KfELfr+ox9RgxrqhGrgfYKjXb9Tey2y6QypbtTIXF7l/LAS3JPrQ9hISZrP52KTEf7Yne4iTdGR3Vr/oHP6fXaHKe8hCERS+fLNskjNglKbsMbPo/C8Rj5jNq0uNTtOaM3mez8gdRzUTbsivt31NraqRSrlfB3Gwcqqqe/7EwW4YXrHMmRZGfPgeSUN2gnHW6v/Lngi7APpe/eGTp+BsGBLHjLuDUBgRPsh0J93LmMnvWk6fOCo3QMoSaEy99UbfdaXlQyyH0dkaqu0XTnSl4dDSU3laG+esn5uL0z7/DHvZaqoJ6FhgheSIbVM7hbjbBXYl7wY05k7CVZ7MuULKkIAJ9uLbgs3X5JJ3OyXnledKk3+1DpIUBp3xzUHxU8iM=) 2025-09-23 18:49:11.704980 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIsS30+FS6h+5tyQAvp2wjOBnUU+1n6voY3/QS3t7Fso) 2025-09-23 18:49:11.704987 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA1fGfWlujwUz8kKa27gAatQW8NCHaQUoxGyQaFDbRVYbz5jxmmeGEKkHHGTdtbt3QyAxLBvQJ4yBZcHn+27lCg=) 2025-09-23 18:49:17.447176 | orchestrator | 2025-09-23 18:49:17.447378 | orchestrator | 
TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:17.447408 | orchestrator | Tuesday 23 September 2025 18:49:11 +0000 (0:00:01.178) 0:00:24.754 ***** 2025-09-23 18:49:17.447432 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDO+L3xUrTj7aC/XUVnhonOCZWYcGR/HxXAAbbzmWQWOj/BlMr/vJpSildFhsLUs4r4OtMxkxJYvmWoV4eRqoE7AAv9paWp0KHRcQhGu/qbslYjzcbP+34pU98vkHeYAQXvr4ZdIIUGQfFcpyC/KHHwwsn41i1nNF2vuA+xWsH7+2VvQk5g07o3QjSbka67EyEKWgvvX+o9U6IaRKNkJ2W+1CB34KFjdg1vyh2eZjisO0S87ZR5IqhaU5tmUFnTYYqXEHDVB4niM//F4N3djeixwQFfESxRf/OqREC4hd1qzWA7w6AmKqhM2CmXLs1lN02ZF8yWv/Pqn3gDBWKuWvXdTSmt63CWOnKNoX0N0HsA56iUq7BLaIoNut7ASCpy8rJJ+7ENPsWsnSrOsqaaU3lUeM6/Li/LXJsl6QAlgyO53yZXH43OJIwTn5A2HywlfqBsFLkqXq1P1Qc6n5GC12FwR1UZupM8txNOaEwE9dcGD9yEeravgjQ1QkcAWERooF0=) 2025-09-23 18:49:17.447458 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKgGivis+OaLw6m9lEescIRqkbPZ2lWqOnQ1b8Tj0nqSNJZSEVLbiMfmYwIUZvzYNlSKhjIv6cT12g+yPg9AXTk=) 2025-09-23 18:49:17.447479 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF62dlsAnCMA3cqzMJzkiddLj8Z0cxNP0cVCcYaMwt7K) 2025-09-23 18:49:17.447498 | orchestrator | 2025-09-23 18:49:17.447509 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:17.447538 | orchestrator | Tuesday 23 September 2025 18:49:12 +0000 (0:00:01.160) 0:00:25.915 ***** 2025-09-23 18:49:17.447572 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOebNm2PeY4wYacPnwhYVg6A4AA4FpTO2S6iZwKQYXh2oNzq1TSf/6NfF+yZGPqoRYFEbwjEZs3PdGW17FNu7k4=) 2025-09-23 18:49:17.447584 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC2iAk/ZhWdbqOOPqrItgvhagJIkjP3uN9sCyIoAO/VYrR/kjLJSXJMHgC0r4MvFoWCuAUrzu8QxudlF8wRUY35LiVxl/OADB/zvMXGbhOvSirqwo3zmhELpt+gLSF++UGcx0K3vBpS961vYfxfZIqd65lleME55cBdHLqmTdiDr/E8NkUmRkNFF0DoN9VuViXaWbWSVP1ZWrK7cKGjOObbQ25X3Zf/M7EaW1SPAahLH1vMGuWAaJ3DYALSscQ1/59nAcC1fJi95WxduYE+8+3tb6PmGgnlBApn9x0iUHm4NRrDa3EMhR+tOCClcqNwH6rTeDhYtvUF5l8B+rQMHZ0eNruoqYi+KgZYESITICMvPEXmxTGNJOSettHZk7ZiKWYdB9BDJ8p5y6nu953N2Fwbh+BMyMsEE/q++XHF2/lfe+wdWcxf9K6GCL6YKfyGTDF2Z8HCUsUByqIsIROjMEDckxcb7VZi3umkNQwuTBGtW6IKq3pZ9BIGod+jltd4Dn0=) 2025-09-23 18:49:17.447596 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHc/ITN0M0vxZt96Eong/NBl4k86LQ27mQ+8s7SeLpCx) 2025-09-23 18:49:17.447607 | orchestrator | 2025-09-23 18:49:17.447618 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-23 18:49:17.447629 | orchestrator | Tuesday 23 September 2025 18:49:15 +0000 (0:00:02.177) 0:00:28.093 ***** 2025-09-23 18:49:17.447640 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDOXxVnEcYMAyhF4lJzmO0qlcxzDW8Ym31Pol6GMp44bEdQSDzLZC9r9id+EE2yDyf99s4nedWb/PhH4dnntHF7ICr4EI/yWjbVDWsQdVkxIJ/rbvbmov9+TQeAwnmymIz+ghroWyExwg2eKn9CqYqsDZ4idDiKuvBYs+R0Dq6QdlKB88nil5CayZDyrOw0vFrUPoExCfr4Rj6XPhNZM8wbh/mVSiPHFkA8ZuG9j4FFjvZf4dUxsSpz0oO6dQ9DpKr+Gn9HsKDXfF1C7hqXgKm7wWMu1MIjaNQfC7qgE3ZxV9Hxiiu2WvwIK57OaL5a6jXqsSXVFgKz9fueF0q4uPKYKspRT015j5AfmSBtOFMcGND5E7Q0W+U+GRVa2W8NMfeg9DPIciK1yszrrD9byyvJXxrhM001vCD9sRXj3k0cTjuHfHNSXLWmQA/ixd+yWVs4gJ0OgExb3LOYRGIRNpS2TEH+69uFjoxrt/Fx1IM/yVR9F8EbDVuljlNVPyBRec=) 2025-09-23 18:49:17.447651 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJZJPrQmwkHQWvaIxHFV6vyBsnP5EXrUhOH6OuEdSJF0xjoluCK9ZDUFxesezo7D33/E7mO787FFyqVWcnRzB8A=) 2025-09-23 18:49:17.447662 | orchestrator | changed: [testbed-manager] => 
(item=192.168.16.15 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKl+3K9luOwrZSD7nYEhLPqx8ThqS/SMCwhdlMdjbny3) 2025-09-23 18:49:17.447673 | orchestrator | 2025-09-23 18:49:17.447685 | orchestrator | TASK [osism.commons.known_hosts : Write static known_hosts entries] ************ 2025-09-23 18:49:17.447698 | orchestrator | Tuesday 23 September 2025 18:49:16 +0000 (0:00:01.088) 0:00:29.181 ***** 2025-09-23 18:49:17.447710 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)  2025-09-23 18:49:17.447722 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2025-09-23 18:49:17.447734 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2025-09-23 18:49:17.447746 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2025-09-23 18:49:17.447756 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2025-09-23 18:49:17.447767 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2025-09-23 18:49:17.447777 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2025-09-23 18:49:17.447788 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:49:17.447799 | orchestrator | 2025-09-23 18:49:17.447829 | orchestrator | TASK [osism.commons.known_hosts : Write extra known_hosts entries] ************* 2025-09-23 18:49:17.447841 | orchestrator | Tuesday 23 September 2025 18:49:16 +0000 (0:00:00.194) 0:00:29.376 ***** 2025-09-23 18:49:17.447851 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:49:17.447862 | orchestrator | 2025-09-23 18:49:17.447873 | orchestrator | TASK [osism.commons.known_hosts : Delete known_hosts entries] ****************** 2025-09-23 18:49:17.447884 | orchestrator | Tuesday 23 September 2025 18:49:16 +0000 (0:00:00.069) 0:00:29.445 ***** 2025-09-23 18:49:17.447895 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:49:17.447906 | orchestrator | 2025-09-23 18:49:17.447924 | orchestrator | TASK [osism.commons.known_hosts : Set file 
permissions] ************************ 2025-09-23 18:49:17.447934 | orchestrator | Tuesday 23 September 2025 18:49:16 +0000 (0:00:00.056) 0:00:29.502 ***** 2025-09-23 18:49:17.447945 | orchestrator | changed: [testbed-manager] 2025-09-23 18:49:17.447956 | orchestrator | 2025-09-23 18:49:17.447967 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 18:49:17.447978 | orchestrator | testbed-manager : ok=31  changed=15  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-09-23 18:49:17.447989 | orchestrator | 2025-09-23 18:49:17.448000 | orchestrator | 2025-09-23 18:49:17.448011 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 18:49:17.448021 | orchestrator | Tuesday 23 September 2025 18:49:17 +0000 (0:00:00.706) 0:00:30.208 ***** 2025-09-23 18:49:17.448033 | orchestrator | =============================================================================== 2025-09-23 18:49:17.448044 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname --- 6.18s 2025-09-23 18:49:17.448055 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host --- 5.40s 2025-09-23 18:49:17.448066 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 2.18s 2025-09-23 18:49:17.448077 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.26s 2025-09-23 18:49:17.448088 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.18s 2025-09-23 18:49:17.448098 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.17s 2025-09-23 18:49:17.448109 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.17s 2025-09-23 18:49:17.448120 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.16s 2025-09-23 
18:49:17.448130 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.16s 2025-09-23 18:49:17.448141 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.16s 2025-09-23 18:49:17.448159 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.15s 2025-09-23 18:49:17.448170 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.13s 2025-09-23 18:49:17.448181 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.12s 2025-09-23 18:49:17.448192 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.09s 2025-09-23 18:49:17.448202 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.07s 2025-09-23 18:49:17.448213 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.07s 2025-09-23 18:49:17.448256 | orchestrator | osism.commons.known_hosts : Set file permissions ------------------------ 0.71s 2025-09-23 18:49:17.448267 | orchestrator | osism.commons.known_hosts : Write static known_hosts entries ------------ 0.19s 2025-09-23 18:49:17.448278 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname --- 0.19s 2025-09-23 18:49:17.448290 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host --- 0.17s 2025-09-23 18:49:17.760160 | orchestrator | + osism apply squid 2025-09-23 18:49:29.831730 | orchestrator | 2025-09-23 18:49:29 | INFO  | Task dbffeddd-7cc2-47cd-a64c-3b0a3031e407 (squid) was prepared for execution. 2025-09-23 18:49:29.831839 | orchestrator | 2025-09-23 18:49:29 | INFO  | It takes a moment until task dbffeddd-7cc2-47cd-a64c-3b0a3031e407 (squid) has been started and output is visible here. 
2025-09-23 18:51:23.100103 | orchestrator | 2025-09-23 18:51:23.100234 | orchestrator | PLAY [Apply role squid] ******************************************************** 2025-09-23 18:51:23.100249 | orchestrator | 2025-09-23 18:51:23.100259 | orchestrator | TASK [osism.services.squid : Include install tasks] **************************** 2025-09-23 18:51:23.100270 | orchestrator | Tuesday 23 September 2025 18:49:33 +0000 (0:00:00.146) 0:00:00.146 ***** 2025-09-23 18:51:23.100280 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/squid/tasks/install-Debian-family.yml for testbed-manager 2025-09-23 18:51:23.100320 | orchestrator | 2025-09-23 18:51:23.100331 | orchestrator | TASK [osism.services.squid : Install required packages] ************************ 2025-09-23 18:51:23.100341 | orchestrator | Tuesday 23 September 2025 18:49:33 +0000 (0:00:00.073) 0:00:00.220 ***** 2025-09-23 18:51:23.100351 | orchestrator | ok: [testbed-manager] 2025-09-23 18:51:23.100362 | orchestrator | 2025-09-23 18:51:23.100372 | orchestrator | TASK [osism.services.squid : Create required directories] ********************** 2025-09-23 18:51:23.100382 | orchestrator | Tuesday 23 September 2025 18:49:35 +0000 (0:00:01.260) 0:00:01.480 ***** 2025-09-23 18:51:23.100392 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration) 2025-09-23 18:51:23.100401 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration/conf.d) 2025-09-23 18:51:23.100411 | orchestrator | ok: [testbed-manager] => (item=/opt/squid) 2025-09-23 18:51:23.100421 | orchestrator | 2025-09-23 18:51:23.100431 | orchestrator | TASK [osism.services.squid : Copy squid configuration files] ******************* 2025-09-23 18:51:23.100440 | orchestrator | Tuesday 23 September 2025 18:49:36 +0000 (0:00:01.046) 0:00:02.527 ***** 2025-09-23 18:51:23.100450 | orchestrator | changed: [testbed-manager] => (item=osism.conf) 2025-09-23 18:51:23.100460 | 
orchestrator | 2025-09-23 18:51:23.100470 | orchestrator | TASK [osism.services.squid : Remove osism_allow_list.conf configuration file] *** 2025-09-23 18:51:23.100479 | orchestrator | Tuesday 23 September 2025 18:49:37 +0000 (0:00:01.003) 0:00:03.530 ***** 2025-09-23 18:51:23.100489 | orchestrator | ok: [testbed-manager] 2025-09-23 18:51:23.100499 | orchestrator | 2025-09-23 18:51:23.100508 | orchestrator | TASK [osism.services.squid : Copy docker-compose.yml file] ********************* 2025-09-23 18:51:23.100518 | orchestrator | Tuesday 23 September 2025 18:49:37 +0000 (0:00:00.324) 0:00:03.855 ***** 2025-09-23 18:51:23.100528 | orchestrator | changed: [testbed-manager] 2025-09-23 18:51:23.100538 | orchestrator | 2025-09-23 18:51:23.100547 | orchestrator | TASK [osism.services.squid : Manage squid service] ***************************** 2025-09-23 18:51:23.100557 | orchestrator | Tuesday 23 September 2025 18:49:38 +0000 (0:00:00.855) 0:00:04.711 ***** 2025-09-23 18:51:23.100566 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage squid service (10 retries left). 
2025-09-23 18:51:23.100577 | orchestrator | ok: [testbed-manager] 2025-09-23 18:51:23.100587 | orchestrator | 2025-09-23 18:51:23.100596 | orchestrator | RUNNING HANDLER [osism.services.squid : Restart squid service] ***************** 2025-09-23 18:51:23.100606 | orchestrator | Tuesday 23 September 2025 18:50:09 +0000 (0:00:31.652) 0:00:36.364 ***** 2025-09-23 18:51:23.100616 | orchestrator | changed: [testbed-manager] 2025-09-23 18:51:23.100626 | orchestrator | 2025-09-23 18:51:23.100636 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for squid service to start] ******* 2025-09-23 18:51:23.100645 | orchestrator | Tuesday 23 September 2025 18:50:22 +0000 (0:00:12.121) 0:00:48.486 ***** 2025-09-23 18:51:23.100656 | orchestrator | Pausing for 60 seconds 2025-09-23 18:51:23.100668 | orchestrator | changed: [testbed-manager] 2025-09-23 18:51:23.100679 | orchestrator | 2025-09-23 18:51:23.100690 | orchestrator | RUNNING HANDLER [osism.services.squid : Register that squid service was restarted] *** 2025-09-23 18:51:23.100701 | orchestrator | Tuesday 23 September 2025 18:51:22 +0000 (0:01:00.071) 0:01:48.557 ***** 2025-09-23 18:51:23.100712 | orchestrator | ok: [testbed-manager] 2025-09-23 18:51:23.100723 | orchestrator | 2025-09-23 18:51:23.100733 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for an healthy squid service] ***** 2025-09-23 18:51:23.100744 | orchestrator | Tuesday 23 September 2025 18:51:22 +0000 (0:00:00.060) 0:01:48.618 ***** 2025-09-23 18:51:23.100755 | orchestrator | changed: [testbed-manager] 2025-09-23 18:51:23.100766 | orchestrator | 2025-09-23 18:51:23.100777 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 18:51:23.100788 | orchestrator | testbed-manager : ok=11  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 18:51:23.100798 | orchestrator | 2025-09-23 18:51:23.100809 | orchestrator | 2025-09-23 18:51:23.100828 | orchestrator | 
TASKS RECAP ********************************************************************
2025-09-23 18:51:23.100839 | orchestrator | Tuesday 23 September 2025 18:51:22 +0000 (0:00:00.633) 0:01:49.252 *****
2025-09-23 18:51:23.100849 | orchestrator | ===============================================================================
2025-09-23 18:51:23.100860 | orchestrator | osism.services.squid : Wait for squid service to start ----------------- 60.07s
2025-09-23 18:51:23.100871 | orchestrator | osism.services.squid : Manage squid service ---------------------------- 31.65s
2025-09-23 18:51:23.100882 | orchestrator | osism.services.squid : Restart squid service --------------------------- 12.12s
2025-09-23 18:51:23.100893 | orchestrator | osism.services.squid : Install required packages ------------------------ 1.26s
2025-09-23 18:51:23.100903 | orchestrator | osism.services.squid : Create required directories ---------------------- 1.05s
2025-09-23 18:51:23.100914 | orchestrator | osism.services.squid : Copy squid configuration files ------------------- 1.00s
2025-09-23 18:51:23.100925 | orchestrator | osism.services.squid : Copy docker-compose.yml file --------------------- 0.86s
2025-09-23 18:51:23.100935 | orchestrator | osism.services.squid : Wait for an healthy squid service ---------------- 0.63s
2025-09-23 18:51:23.100946 | orchestrator | osism.services.squid : Remove osism_allow_list.conf configuration file --- 0.32s
2025-09-23 18:51:23.100957 | orchestrator | osism.services.squid : Include install tasks ---------------------------- 0.07s
2025-09-23 18:51:23.100968 | orchestrator | osism.services.squid : Register that squid service was restarted -------- 0.06s
2025-09-23 18:51:23.413109 | orchestrator | + [[ latest != \l\a\t\e\s\t ]]
2025-09-23 18:51:23.413206 | orchestrator | ++ semver latest 9.0.0
2025-09-23 18:51:23.476953 | orchestrator | + [[ -1 -lt 0 ]]
2025-09-23 18:51:23.477168 | orchestrator | + [[ latest != \l\a\t\e\s\t ]]
2025-09-23 18:51:23.477200 | orchestrator | + osism apply operator -u ubuntu -l testbed-nodes
2025-09-23 18:51:35.570685 | orchestrator | 2025-09-23 18:51:35 | INFO  | Task 74793bb9-74b4-40cd-b588-9ee4aa42945f (operator) was prepared for execution.
2025-09-23 18:51:35.570779 | orchestrator | 2025-09-23 18:51:35 | INFO  | It takes a moment until task 74793bb9-74b4-40cd-b588-9ee4aa42945f (operator) has been started and output is visible here.
2025-09-23 18:51:51.288293 | orchestrator |
2025-09-23 18:51:51.288402 | orchestrator | PLAY [Make ssh pipelining working] *********************************************
2025-09-23 18:51:51.288413 | orchestrator |
2025-09-23 18:51:51.288420 | orchestrator | TASK [Gathering Facts] *********************************************************
2025-09-23 18:51:51.288427 | orchestrator | Tuesday 23 September 2025 18:51:39 +0000 (0:00:00.152) 0:00:00.152 *****
2025-09-23 18:51:51.288434 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:51:51.288441 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:51:51.288447 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:51:51.288455 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:51:51.289280 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:51:51.289348 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:51:51.289362 | orchestrator |
2025-09-23 18:51:51.289377 | orchestrator | TASK [Do not require tty for all users] ****************************************
2025-09-23 18:51:51.289390 | orchestrator | Tuesday 23 September 2025 18:51:42 +0000 (0:00:03.344) 0:00:03.497 *****
2025-09-23 18:51:51.289401 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:51:51.289412 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:51:51.289424 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:51:51.289435 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:51:51.289446 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:51:51.289457 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:51:51.289468 | orchestrator |
2025-09-23 18:51:51.289483 | orchestrator | PLAY [Apply role operator] *****************************************************
2025-09-23 18:51:51.289495 | orchestrator |
2025-09-23 18:51:51.289506 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] *****
2025-09-23 18:51:51.289517 | orchestrator | Tuesday 23 September 2025 18:51:43 +0000 (0:00:00.735) 0:00:04.233 *****
2025-09-23 18:51:51.289528 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:51:51.289539 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:51:51.289550 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:51:51.289593 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:51:51.289605 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:51:51.289616 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:51:51.289627 | orchestrator |
2025-09-23 18:51:51.289638 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] ***
2025-09-23 18:51:51.289649 | orchestrator | Tuesday 23 September 2025 18:51:43 +0000 (0:00:00.174) 0:00:04.407 *****
2025-09-23 18:51:51.289660 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:51:51.289671 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:51:51.289682 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:51:51.289693 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:51:51.289703 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:51:51.289714 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:51:51.289725 | orchestrator |
2025-09-23 18:51:51.289753 | orchestrator | TASK [osism.commons.operator : Create operator group] **************************
2025-09-23 18:51:51.289765 | orchestrator | Tuesday 23 September 2025 18:51:43 +0000 (0:00:00.159) 0:00:04.566 *****
2025-09-23 18:51:51.289777 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:51:51.289789 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:51:51.289800 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:51:51.289811 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:51:51.289827 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:51:51.289838 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:51:51.289849 | orchestrator |
2025-09-23 18:51:51.289860 | orchestrator | TASK [osism.commons.operator : Create user] ************************************
2025-09-23 18:51:51.289872 | orchestrator | Tuesday 23 September 2025 18:51:44 +0000 (0:00:00.668) 0:00:05.235 *****
2025-09-23 18:51:51.289883 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:51:51.289893 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:51:51.289904 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:51:51.289915 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:51:51.289926 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:51:51.289937 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:51:51.289948 | orchestrator |
2025-09-23 18:51:51.289959 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ******************
2025-09-23 18:51:51.289970 | orchestrator | Tuesday 23 September 2025 18:51:45 +0000 (0:00:00.783) 0:00:06.018 *****
2025-09-23 18:51:51.289981 | orchestrator | changed: [testbed-node-0] => (item=adm)
2025-09-23 18:51:51.289992 | orchestrator | changed: [testbed-node-1] => (item=adm)
2025-09-23 18:51:51.290003 | orchestrator | changed: [testbed-node-2] => (item=adm)
2025-09-23 18:51:51.290072 | orchestrator | changed: [testbed-node-3] => (item=adm)
2025-09-23 18:51:51.290085 | orchestrator | changed: [testbed-node-5] => (item=adm)
2025-09-23 18:51:51.290120 | orchestrator | changed: [testbed-node-4] => (item=adm)
2025-09-23 18:51:51.290131 | orchestrator | changed: [testbed-node-0] => (item=sudo)
2025-09-23 18:51:51.290142 | orchestrator | changed: [testbed-node-1] => (item=sudo)
2025-09-23 18:51:51.290154 | orchestrator | changed: [testbed-node-3] => (item=sudo)
2025-09-23 18:51:51.290164 | orchestrator | changed: [testbed-node-2] => (item=sudo)
2025-09-23 18:51:51.290175 | orchestrator | changed: [testbed-node-5] => (item=sudo)
2025-09-23 18:51:51.290187 | orchestrator | changed: [testbed-node-4] => (item=sudo)
2025-09-23 18:51:51.290198 | orchestrator |
2025-09-23 18:51:51.290209 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] *************************
2025-09-23 18:51:51.290220 | orchestrator | Tuesday 23 September 2025 18:51:46 +0000 (0:00:01.183) 0:00:07.202 *****
2025-09-23 18:51:51.290230 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:51:51.290241 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:51:51.290252 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:51:51.290263 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:51:51.290274 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:51:51.290285 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:51:51.290295 | orchestrator |
2025-09-23 18:51:51.290307 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] ***
2025-09-23 18:51:51.290330 | orchestrator | Tuesday 23 September 2025 18:51:47 +0000 (0:00:01.321) 0:00:08.524 *****
2025-09-23 18:51:51.290341 | orchestrator | [WARNING]: Module remote_tmp /root/.ansible/tmp did not exist and was created
2025-09-23 18:51:51.290352 | orchestrator | with a mode of 0700, this may cause issues when running as another user. To
2025-09-23 18:51:51.290363 | orchestrator | avoid this, create the remote_tmp dir with the correct permissions manually
2025-09-23 18:51:51.290374 | orchestrator | changed: [testbed-node-1] => (item=export LANGUAGE=C.UTF-8)
2025-09-23 18:51:51.290411 | orchestrator | changed: [testbed-node-5] => (item=export LANGUAGE=C.UTF-8)
2025-09-23 18:51:51.290423 | orchestrator | changed: [testbed-node-0] => (item=export LANGUAGE=C.UTF-8)
2025-09-23 18:51:51.290434 | orchestrator | changed: [testbed-node-3] => (item=export LANGUAGE=C.UTF-8)
2025-09-23 18:51:51.290444 | orchestrator | changed: [testbed-node-2] => (item=export LANGUAGE=C.UTF-8)
2025-09-23 18:51:51.290455 | orchestrator | changed: [testbed-node-4] => (item=export LANGUAGE=C.UTF-8)
2025-09-23 18:51:51.290466 | orchestrator | changed: [testbed-node-0] => (item=export LANG=C.UTF-8)
2025-09-23 18:51:51.290477 | orchestrator | changed: [testbed-node-5] => (item=export LANG=C.UTF-8)
2025-09-23 18:51:51.290488 | orchestrator | changed: [testbed-node-1] => (item=export LANG=C.UTF-8)
2025-09-23 18:51:51.290499 | orchestrator | changed: [testbed-node-3] => (item=export LANG=C.UTF-8)
2025-09-23 18:51:51.290509 | orchestrator | changed: [testbed-node-2] => (item=export LANG=C.UTF-8)
2025-09-23 18:51:51.290520 | orchestrator | changed: [testbed-node-4] => (item=export LANG=C.UTF-8)
2025-09-23 18:51:51.290531 | orchestrator | changed: [testbed-node-5] => (item=export LC_ALL=C.UTF-8)
2025-09-23 18:51:51.290542 | orchestrator | changed: [testbed-node-1] => (item=export LC_ALL=C.UTF-8)
2025-09-23 18:51:51.290553 | orchestrator | changed: [testbed-node-3] => (item=export LC_ALL=C.UTF-8)
2025-09-23 18:51:51.290564 | orchestrator | changed: [testbed-node-0] => (item=export LC_ALL=C.UTF-8)
2025-09-23 18:51:51.290575 | orchestrator | changed: [testbed-node-2] => (item=export LC_ALL=C.UTF-8)
2025-09-23 18:51:51.290585 | orchestrator | changed: [testbed-node-4] => (item=export LC_ALL=C.UTF-8)
2025-09-23 18:51:51.290596 | orchestrator |
2025-09-23 18:51:51.290607 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] ***
2025-09-23 18:51:51.290619 | orchestrator | Tuesday 23 September 2025 18:51:49 +0000 (0:00:01.314) 0:00:09.838 *****
2025-09-23 18:51:51.290630 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:51:51.290641 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:51:51.290652 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:51:51.290663 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:51:51.290674 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:51:51.290685 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:51:51.290695 | orchestrator |
2025-09-23 18:51:51.290706 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] **************************
2025-09-23 18:51:51.290717 | orchestrator | Tuesday 23 September 2025 18:51:49 +0000 (0:00:00.143) 0:00:09.983 *****
2025-09-23 18:51:51.290728 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:51:51.290739 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:51:51.290750 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:51:51.290760 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:51:51.290771 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:51:51.290782 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:51:51.290793 | orchestrator |
2025-09-23 18:51:51.290804 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************
2025-09-23 18:51:51.290815 | orchestrator | Tuesday 23 September 2025 18:51:49 +0000 (0:00:00.569) 0:00:10.552 *****
2025-09-23 18:51:51.290825 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:51:51.290836 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:51:51.290847 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:51:51.290858 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:51:51.290875 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:51:51.290886 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:51:51.290897 | orchestrator |
2025-09-23 18:51:51.290908 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************
2025-09-23 18:51:51.290919 | orchestrator | Tuesday 23 September 2025 18:51:50 +0000 (0:00:00.202) 0:00:10.755 *****
2025-09-23 18:51:51.290930 | orchestrator | changed: [testbed-node-5] => (item=None)
2025-09-23 18:51:51.290941 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:51:51.290952 | orchestrator | changed: [testbed-node-2] => (item=None)
2025-09-23 18:51:51.290963 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:51:51.290974 | orchestrator | changed: [testbed-node-4] => (item=None)
2025-09-23 18:51:51.290984 | orchestrator | changed: [testbed-node-1] => (item=None)
2025-09-23 18:51:51.290995 | orchestrator | changed: [testbed-node-3] => (item=None)
2025-09-23 18:51:51.291006 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:51:51.291017 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:51:51.291028 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:51:51.291039 | orchestrator | changed: [testbed-node-0] => (item=None)
2025-09-23 18:51:51.291050 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:51:51.291061 | orchestrator |
2025-09-23 18:51:51.291072 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] *********************
2025-09-23 18:51:51.291083 | orchestrator | Tuesday 23 September 2025 18:51:50 +0000 (0:00:00.677) 0:00:11.433 *****
2025-09-23 18:51:51.291128 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:51:51.291140 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:51:51.291151 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:51:51.291162 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:51:51.291173 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:51:51.291184 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:51:51.291195 | orchestrator |
2025-09-23 18:51:51.291206 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] *****************
2025-09-23 18:51:51.291217 | orchestrator | Tuesday 23 September 2025 18:51:50 +0000 (0:00:00.140) 0:00:11.573 *****
2025-09-23 18:51:51.291228 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:51:51.291239 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:51:51.291249 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:51:51.291260 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:51:51.291271 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:51:51.291282 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:51:51.291293 | orchestrator |
2025-09-23 18:51:51.291304 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] **************
2025-09-23 18:51:51.291316 | orchestrator | Tuesday 23 September 2025 18:51:51 +0000 (0:00:00.174) 0:00:11.748 *****
2025-09-23 18:51:51.291327 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:51:51.291338 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:51:51.291349 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:51:51.291360 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:51:51.291379 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:51:52.387508 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:51:52.387601 | orchestrator |
2025-09-23 18:51:52.387616 | orchestrator | TASK [osism.commons.operator : Set password] ***********************************
2025-09-23 18:51:52.387629 | orchestrator | Tuesday 23 September 2025 18:51:51 +0000 (0:00:00.160) 0:00:11.908 *****
2025-09-23 18:51:52.387640 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:51:52.387651 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:51:52.387662 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:51:52.387673 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:51:52.387684 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:51:52.387694 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:51:52.387706 | orchestrator |
2025-09-23 18:51:52.387717 | orchestrator | TASK [osism.commons.operator : Unset & lock password] **************************
2025-09-23 18:51:52.387728 | orchestrator | Tuesday 23 September 2025 18:51:51 +0000 (0:00:00.661) 0:00:12.569 *****
2025-09-23 18:51:52.387764 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:51:52.387775 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:51:52.387786 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:51:52.387797 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:51:52.387808 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:51:52.387818 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:51:52.387829 | orchestrator |
2025-09-23 18:51:52.387840 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 18:51:52.387852 | orchestrator | testbed-node-0 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0
2025-09-23 18:51:52.387864 | orchestrator | testbed-node-1 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0
2025-09-23 18:51:52.387875 | orchestrator | testbed-node-2 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0
2025-09-23 18:51:52.387886 | orchestrator | testbed-node-3 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0
2025-09-23 18:51:52.387913 | orchestrator | testbed-node-4 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0
2025-09-23 18:51:52.387925 | orchestrator | testbed-node-5 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0
2025-09-23 18:51:52.387936 | orchestrator |
2025-09-23 18:51:52.387947 | orchestrator |
2025-09-23 18:51:52.387962 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 18:51:52.387973 | orchestrator | Tuesday 23 September 2025 18:51:52 +0000 (0:00:00.220) 0:00:12.790 *****
2025-09-23 18:51:52.387984 | orchestrator | ===============================================================================
2025-09-23 18:51:52.387994 | orchestrator | Gathering Facts --------------------------------------------------------- 3.34s
2025-09-23 18:51:52.388005 | orchestrator | osism.commons.operator : Copy user sudoers file ------------------------- 1.32s
2025-09-23 18:51:52.388016 | orchestrator | osism.commons.operator : Set language variables in .bashrc configuration file --- 1.31s
2025-09-23 18:51:52.388027 | orchestrator | osism.commons.operator : Add user to additional groups ------------------ 1.18s
2025-09-23 18:51:52.388038 | orchestrator | osism.commons.operator : Create user ------------------------------------ 0.78s
2025-09-23 18:51:52.388049 | orchestrator | Do not require tty for all users ---------------------------------------- 0.74s
2025-09-23 18:51:52.388061 | orchestrator | osism.commons.operator : Set ssh authorized keys ------------------------ 0.68s
2025-09-23 18:51:52.388073 | orchestrator | osism.commons.operator : Create operator group -------------------------- 0.67s
2025-09-23 18:51:52.388085 | orchestrator | osism.commons.operator : Set password ----------------------------------- 0.66s
2025-09-23 18:51:52.388128 | orchestrator | osism.commons.operator : Create .ssh directory -------------------------- 0.57s
2025-09-23 18:51:52.388140 | orchestrator | osism.commons.operator : Unset & lock password -------------------------- 0.22s
2025-09-23 18:51:52.388152 | orchestrator | osism.commons.operator : Check number of SSH authorized keys ------------ 0.20s
2025-09-23 18:51:52.388164 | orchestrator | osism.commons.operator : Gather variables for each operating system ----- 0.17s
2025-09-23 18:51:52.388175 | orchestrator | osism.commons.operator : Set authorized GitHub accounts ----------------- 0.17s
2025-09-23 18:51:52.388187 | orchestrator | osism.commons.operator : Delete authorized GitHub accounts -------------- 0.16s
2025-09-23 18:51:52.388198 | orchestrator | osism.commons.operator : Set operator_groups variable to default value --- 0.16s
2025-09-23 18:51:52.388210 | orchestrator | osism.commons.operator : Set custom environment variables in .bashrc configuration file --- 0.14s
2025-09-23 18:51:52.388222 | orchestrator | osism.commons.operator : Delete ssh authorized keys --------------------- 0.14s
2025-09-23 18:52:51.716690 | orchestrator | + osism apply --environment custom facts
2025-09-23 18:51:54.581079 | orchestrator | 2025-09-23 18:51:54 | INFO  | Trying to run play facts in environment custom
2025-09-23 18:52:04.762412 | orchestrator | 2025-09-23 18:52:04 | INFO  | Task c3172458-2f9f-43ec-b57a-a130e8a7beca (facts) was prepared for execution.
2025-09-23 18:52:04.762521 | orchestrator | 2025-09-23 18:52:04 | INFO  | It takes a moment until task c3172458-2f9f-43ec-b57a-a130e8a7beca (facts) has been started and output is visible here.
2025-09-23 18:52:51.330517 | orchestrator |
2025-09-23 18:52:51.330686 | orchestrator | PLAY [Copy custom network devices fact] ****************************************
2025-09-23 18:52:51.330706 | orchestrator |
2025-09-23 18:52:51.330719 | orchestrator | TASK [Create custom facts directory] *******************************************
2025-09-23 18:52:51.330731 | orchestrator | Tuesday 23 September 2025 18:52:08 +0000 (0:00:00.099) 0:00:00.099 *****
2025-09-23 18:52:51.330742 | orchestrator | ok: [testbed-manager]
2025-09-23 18:52:51.330755 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:52:51.330766 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:52:51.330778 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:52:51.330789 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:52:51.330800 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:52:51.330811 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:52:51.330822 | orchestrator |
2025-09-23 18:52:51.330833 | orchestrator | TASK [Copy fact file] **********************************************************
2025-09-23 18:52:51.330844 | orchestrator | Tuesday 23 September 2025 18:52:10 +0000 (0:00:01.509) 0:00:01.609 *****
2025-09-23 18:52:51.330854 | orchestrator | ok: [testbed-manager]
2025-09-23 18:52:51.330865 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:52:51.330876 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:52:51.330887 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:52:51.330897 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:52:51.330960 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:52:51.330972 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:52:51.330983 | orchestrator |
2025-09-23 18:52:51.330994 | orchestrator | PLAY [Copy custom ceph devices facts] ******************************************
2025-09-23 18:52:51.331004 | orchestrator |
2025-09-23 18:52:51.331018 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] ***
2025-09-23 18:52:51.331029 | orchestrator | Tuesday 23 September 2025 18:52:11 +0000 (0:00:01.234) 0:00:02.843 *****
2025-09-23 18:52:51.331109 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:52:51.331123 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:52:51.331135 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:52:51.331147 | orchestrator |
2025-09-23 18:52:51.331160 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] ***
2025-09-23 18:52:51.331172 | orchestrator | Tuesday 23 September 2025 18:52:11 +0000 (0:00:00.118) 0:00:02.961 *****
2025-09-23 18:52:51.331184 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:52:51.331196 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:52:51.331207 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:52:51.331219 | orchestrator |
2025-09-23 18:52:51.331231 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ******************
2025-09-23 18:52:51.331243 | orchestrator | Tuesday 23 September 2025 18:52:11 +0000 (0:00:00.220) 0:00:03.182 *****
2025-09-23 18:52:51.331270 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:52:51.331282 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:52:51.331304 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:52:51.331317 | orchestrator |
2025-09-23 18:52:51.331329 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] ***
2025-09-23 18:52:51.331357 | orchestrator | Tuesday 23 September 2025 18:52:11 +0000 (0:00:00.202) 0:00:03.384 *****
2025-09-23 18:52:51.331372 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:52:51.331384 | orchestrator |
2025-09-23 18:52:51.331396 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] *****
2025-09-23 18:52:51.331430 | orchestrator | Tuesday 23 September 2025 18:52:12 +0000 (0:00:00.126) 0:00:03.510 *****
2025-09-23 18:52:51.331474 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:52:51.331486 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:52:51.331497 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:52:51.331508 | orchestrator |
2025-09-23 18:52:51.331518 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] *************
2025-09-23 18:52:51.331529 | orchestrator | Tuesday 23 September 2025 18:52:12 +0000 (0:00:00.443) 0:00:03.954 *****
2025-09-23 18:52:51.331540 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:52:51.331551 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:52:51.331562 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:52:51.331573 | orchestrator |
2025-09-23 18:52:51.331583 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] ***************
2025-09-23 18:52:51.331594 | orchestrator | Tuesday 23 September 2025 18:52:12 +0000 (0:00:00.121) 0:00:04.076 *****
2025-09-23 18:52:51.331605 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:52:51.331616 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:52:51.331626 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:52:51.331637 | orchestrator |
2025-09-23 18:52:51.331648 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] *********************
2025-09-23 18:52:51.331659 | orchestrator | Tuesday 23 September 2025 18:52:13 +0000 (0:00:01.088) 0:00:05.164 *****
2025-09-23 18:52:51.331669 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:52:51.331680 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:52:51.331691 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:52:51.331701 | orchestrator |
2025-09-23 18:52:51.331712 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] *********************
2025-09-23 18:52:51.331723 | orchestrator | Tuesday 23 September 2025 18:52:14 +0000 (0:00:00.571) 0:00:05.735 *****
2025-09-23 18:52:51.331734 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:52:51.331745 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:52:51.331755 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:52:51.331766 | orchestrator |
2025-09-23 18:52:51.331777 | orchestrator | TASK [osism.commons.repository : Update package cache] *************************
2025-09-23 18:52:51.331788 | orchestrator | Tuesday 23 September 2025 18:52:15 +0000 (0:00:01.072) 0:00:06.808 *****
2025-09-23 18:52:51.331799 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:52:51.331810 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:52:51.331820 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:52:51.331831 | orchestrator |
2025-09-23 18:52:51.331842 | orchestrator | TASK [Install required packages (RedHat)] **************************************
2025-09-23 18:52:51.331853 | orchestrator | Tuesday 23 September 2025 18:52:33 +0000 (0:00:18.305) 0:00:25.114 *****
2025-09-23 18:52:51.331863 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:52:51.331874 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:52:51.331885 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:52:51.331896 | orchestrator |
2025-09-23 18:52:51.331907 | orchestrator | TASK [Install required packages (Debian)] **************************************
2025-09-23 18:52:51.331937 | orchestrator | Tuesday 23 September 2025 18:52:33 +0000 (0:00:00.107) 0:00:25.221 *****
2025-09-23 18:52:51.331948 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:52:51.331959 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:52:51.331970 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:52:51.331981 | orchestrator |
2025-09-23 18:52:51.331991 | orchestrator | TASK [Create custom facts directory] *******************************************
2025-09-23 18:52:51.332002 | orchestrator | Tuesday 23 September 2025 18:52:41 +0000 (0:00:08.189) 0:00:33.411 *****
2025-09-23 18:52:51.332013 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:52:51.332024 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:52:51.332035 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:52:51.332094 | orchestrator |
2025-09-23 18:52:51.332106 | orchestrator | TASK [Copy fact files] *********************************************************
2025-09-23 18:52:51.332117 | orchestrator | Tuesday 23 September 2025 18:52:42 +0000 (0:00:00.474) 0:00:33.885 *****
2025-09-23 18:52:51.332137 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices)
2025-09-23 18:52:51.332149 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices)
2025-09-23 18:52:51.332160 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices)
2025-09-23 18:52:51.332170 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices_all)
2025-09-23 18:52:51.332181 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices_all)
2025-09-23 18:52:51.332192 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices_all)
2025-09-23 18:52:51.332231 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices)
2025-09-23 18:52:51.332243 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices)
2025-09-23 18:52:51.332254 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices)
2025-09-23 18:52:51.332265 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices_all)
2025-09-23 18:52:51.332275 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices_all)
2025-09-23 18:52:51.332286 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices_all)
2025-09-23 18:52:51.332297 | orchestrator |
2025-09-23 18:52:51.332307 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] *****
2025-09-23 18:52:51.332318 | orchestrator | Tuesday 23 September 2025 18:52:45 +0000 (0:00:03.521) 0:00:37.407 *****
2025-09-23 18:52:51.332329 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:52:51.332340 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:52:51.332350 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:52:51.332361 | orchestrator |
2025-09-23 18:52:51.332372 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2025-09-23 18:52:51.332383 | orchestrator |
2025-09-23 18:52:51.332393 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2025-09-23 18:52:51.332404 | orchestrator | Tuesday 23 September 2025 18:52:47 +0000 (0:00:01.291) 0:00:38.698 *****
2025-09-23 18:52:51.332415 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:52:51.332426 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:52:51.332437 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:52:51.332447 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:52:51.332458 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:52:51.332469 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:52:51.332479 | orchestrator | ok: [testbed-manager]
2025-09-23 18:52:51.332490 | orchestrator |
2025-09-23 18:52:51.332501 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 18:52:51.332512 | orchestrator | testbed-manager : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 18:52:51.332524 | orchestrator | testbed-node-0 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 18:52:51.332535 | orchestrator | testbed-node-1 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 18:52:51.332588 | orchestrator | testbed-node-2 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 18:52:51.332600 | orchestrator | testbed-node-3 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 18:52:51.332612 | orchestrator | testbed-node-4 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 18:52:51.332623 | orchestrator | testbed-node-5 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 18:52:51.332634 | orchestrator |
2025-09-23 18:52:51.332644 | orchestrator |
2025-09-23 18:52:51.332655 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 18:52:51.332674 | orchestrator | Tuesday 23 September 2025 18:52:51 +0000 (0:00:04.047) 0:00:42.746 *****
2025-09-23 18:52:51.332685 | orchestrator | ===============================================================================
2025-09-23 18:52:51.332696 | orchestrator | osism.commons.repository : Update package cache ------------------------ 18.31s
2025-09-23 18:52:51.332706 | orchestrator | Install required packages (Debian) -------------------------------------- 8.19s
2025-09-23 18:52:51.332717 | orchestrator | Gathers facts about hosts ----------------------------------------------- 4.05s
2025-09-23 18:52:51.332728 | orchestrator | Copy fact files --------------------------------------------------------- 3.52s
2025-09-23 18:52:51.332738 | orchestrator | Create custom facts directory ------------------------------------------- 1.51s
2025-09-23 18:52:51.332749 | orchestrator | osism.commons.repository : Force update of package cache ---------------- 1.29s
2025-09-23 18:52:51.332769 | orchestrator | Copy fact file ---------------------------------------------------------- 1.23s
2025-09-23 18:52:51.594542 | orchestrator | osism.commons.repository : Copy 99osism apt configuration --------------- 1.09s
2025-09-23 18:52:51.594634 | orchestrator | osism.commons.repository : Copy ubuntu.sources file --------------------- 1.07s
2025-09-23 18:52:51.594648 | orchestrator | osism.commons.repository : Remove sources.list file --------------------- 0.57s
2025-09-23 18:52:51.594660 | orchestrator | Create custom facts directory ------------------------------------------- 0.47s
2025-09-23 18:52:51.594671 | orchestrator | osism.commons.repository : Create /etc/apt/sources.list.d directory ----- 0.44s
2025-09-23 18:52:51.594681 | orchestrator | osism.commons.repository : Set repository_default fact to default value --- 0.22s
2025-09-23 18:52:51.594692 | orchestrator | osism.commons.repository : Set repositories to default ------------------ 0.20s
2025-09-23 18:52:51.594703 | orchestrator | osism.commons.repository : Include distribution specific repository tasks --- 0.13s
2025-09-23 18:52:51.594715 | orchestrator | osism.commons.repository : Include tasks for Ubuntu < 24.04 ------------- 0.12s
2025-09-23 18:52:51.594726 | orchestrator | osism.commons.repository : Gather variables for each operating system --- 0.12s
2025-09-23 18:52:51.594737 | orchestrator | Install required packages (RedHat) -------------------------------------- 0.11s
2025-09-23 18:52:51.972225 | orchestrator | + osism apply bootstrap
2025-09-23 18:53:04.015488 | orchestrator | 2025-09-23 18:53:04 | INFO  | Task daf68a93-8d46-4c50-afd3-013bc48cf844 (bootstrap) was prepared for execution.
2025-09-23 18:53:04.015602 | orchestrator | 2025-09-23 18:53:04 | INFO  | It takes a moment until task daf68a93-8d46-4c50-afd3-013bc48cf844 (bootstrap) has been started and output is visible here.
2025-09-23 18:53:20.139860 | orchestrator |
2025-09-23 18:53:20.139977 | orchestrator | PLAY [Group hosts based on state bootstrap] ************************************
2025-09-23 18:53:20.140039 | orchestrator |
2025-09-23 18:53:20.140055 | orchestrator | TASK [Group hosts based on state bootstrap] ************************************
2025-09-23 18:53:20.140075 | orchestrator | Tuesday 23 September 2025 18:53:08 +0000 (0:00:00.165) 0:00:00.165 *****
2025-09-23 18:53:20.140094 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:20.140125 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:20.140143 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:20.140160 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:20.140178 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:20.140195 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:20.140213 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:20.140232 | orchestrator |
2025-09-23 18:53:20.140250 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2025-09-23 18:53:20.140269 | orchestrator |
2025-09-23 18:53:20.140307 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2025-09-23 18:53:20.140328 | orchestrator | Tuesday 23 September 2025 18:53:08 +0000 (0:00:00.263) 0:00:00.428 *****
2025-09-23 18:53:20.140348 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:20.140367 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:20.140384 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:20.140404 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:20.140454 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:20.140473 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:20.140492 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:20.140510 | orchestrator |
2025-09-23 18:53:20.140529 | orchestrator | PLAY [Gather facts for all hosts (if using --limit)] ***************************
2025-09-23 18:53:20.140547 | orchestrator |
2025-09-23 18:53:20.140566 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2025-09-23 18:53:20.140585 | orchestrator | Tuesday 23 September 2025 18:53:12 +0000 (0:00:03.733) 0:00:04.162 *****
2025-09-23 18:53:20.140605 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)
2025-09-23 18:53:20.140624 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)
2025-09-23 18:53:20.140642 | orchestrator | skipping: [testbed-node-0] => (item=testbed-manager)
2025-09-23 18:53:20.140660 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)
2025-09-23 18:53:20.140678 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2025-09-23 18:53:20.140695 | orchestrator | skipping: [testbed-node-1] => (item=testbed-manager)
2025-09-23 18:53:20.140713 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2025-09-23 18:53:20.140730 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)
2025-09-23 18:53:20.140750 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)
2025-09-23 18:53:20.140766 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)
2025-09-23 18:53:20.140777 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2025-09-23 18:53:20.140788 | orchestrator | skipping: [testbed-node-2] => (item=testbed-manager)
2025-09-23 18:53:20.140799 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)
2025-09-23 18:53:20.140810 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)
2025-09-23 18:53:20.140820 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)
2025-09-23 18:53:20.140831 | orchestrator | skipping: [testbed-node-3] => (item=testbed-manager)
2025-09-23 18:53:20.140841 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)
2025-09-23 18:53:20.140852 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)
2025-09-23 18:53:20.140865 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)
2025-09-23 18:53:20.140884 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)
2025-09-23 18:53:20.140902 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:53:20.140919 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)
2025-09-23 18:53:20.140937 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2025-09-23 18:53:20.140955 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:53:20.140975 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)
2025-09-23 18:53:20.141044 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-3)
2025-09-23 18:53:20.141066 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2025-09-23 18:53:20.141085 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)
2025-09-23 18:53:20.141105 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-4)
2025-09-23 18:53:20.141116 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2025-09-23 18:53:20.141127 | orchestrator | skipping: [testbed-node-4] => (item=testbed-manager)
2025-09-23 18:53:20.141138 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-3)
2025-09-23 18:53:20.141149 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2025-09-23 18:53:20.141159 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-5)
2025-09-23 18:53:20.141170 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:53:20.141180 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-4)
2025-09-23 18:53:20.141191 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2025-09-23 18:53:20.141202 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2025-09-23 18:53:20.141212 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-5)
2025-09-23 18:53:20.141235 | orchestrator | skipping: [testbed-node-5] => (item=testbed-manager)
2025-09-23 18:53:20.141253 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2025-09-23 18:53:20.141271 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:53:20.141288 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:53:20.141306 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2025-09-23 18:53:20.141324 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2025-09-23 18:53:20.141342 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2025-09-23 18:53:20.141359 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2025-09-23 18:53:20.141399 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-3)
2025-09-23 18:53:20.141417 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2025-09-23 18:53:20.141435 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-4)
2025-09-23 18:53:20.141454 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-3)
2025-09-23 18:53:20.141473 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-5)
2025-09-23 18:53:20.141491 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:53:20.141508 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-4)
2025-09-23 18:53:20.141525 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-5)
2025-09-23 18:53:20.141542 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:53:20.141560 | orchestrator |
2025-09-23 18:53:20.141579 | orchestrator | PLAY [Apply bootstrap roles part 1] ********************************************
2025-09-23 18:53:20.141598 | orchestrator |
2025-09-23 18:53:20.141617 | orchestrator | TASK [osism.commons.hostname : Set hostname] ***********************************
2025-09-23 18:53:20.141636 | orchestrator | Tuesday 23 September 2025 18:53:12 +0000 (0:00:00.473) 0:00:04.636 *****
2025-09-23 18:53:20.141654 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:20.141672 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:20.141689 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:20.141709 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:20.141727 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:20.141746 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:20.141764 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:20.141783 | orchestrator |
2025-09-23 18:53:20.141801 | orchestrator | TASK [osism.commons.hostname : Copy /etc/hostname] *****************************
2025-09-23 18:53:20.141820 | orchestrator | Tuesday 23 September 2025 18:53:13 +0000 (0:00:01.265) 0:00:05.901 *****
2025-09-23 18:53:20.141839 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:20.141858 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:20.141877 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:20.141894 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:20.141912 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:20.141930 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:20.141947 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:20.141963 | orchestrator |
2025-09-23 18:53:20.141979 | orchestrator | TASK [osism.commons.hosts : Include type specific tasks] ***********************
2025-09-23 18:53:20.142120 | orchestrator | Tuesday 23 September 2025 18:53:15 +0000 (0:00:00.290) 0:00:07.257 *****
2025-09-23 18:53:20.142144 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/hosts/tasks/type-template.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:53:20.142211 | orchestrator |
2025-09-23 18:53:20.142233 | orchestrator | TASK [osism.commons.hosts : Copy /etc/hosts file] ******************************
2025-09-23 18:53:20.142253 | orchestrator | Tuesday 23 September 2025 18:53:15 +0000 (0:00:00.290) 0:00:07.548 *****
2025-09-23 18:53:20.142271 | orchestrator | changed: [testbed-manager]
2025-09-23 18:53:20.142289 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:53:20.142308 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:53:20.142327 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:53:20.142346 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:53:20.142383 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:53:20.142404 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:53:20.142424 | orchestrator |
2025-09-23 18:53:20.142444 | orchestrator | TASK [osism.commons.proxy : Include distribution specific tasks] ***************
2025-09-23 18:53:20.142463 | orchestrator | Tuesday 23 September 2025 18:53:17 +0000 (0:00:02.118) 0:00:09.666 *****
2025-09-23 18:53:20.142483 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:53:20.142504 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/proxy/tasks/Debian-family.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:53:20.142525 | orchestrator |
2025-09-23 18:53:20.142543 | orchestrator | TASK [osism.commons.proxy : Configure proxy parameters for apt] ****************
2025-09-23 18:53:20.142561 | orchestrator | Tuesday 23 September 2025 18:53:17 +0000 (0:00:00.292) 0:00:09.958 *****
2025-09-23 18:53:20.142579 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:53:20.142597 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:53:20.142615 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:53:20.142633 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:53:20.142652 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:53:20.142670 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:53:20.142688 | orchestrator |
2025-09-23 18:53:20.142707 | orchestrator | TASK [osism.commons.proxy : Set system wide settings in environment file] ******
2025-09-23 18:53:20.142725 | orchestrator | Tuesday 23 September 2025 18:53:18 +0000 (0:00:01.003) 0:00:10.962 *****
2025-09-23 18:53:20.142743 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:53:20.142761 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:53:20.142780 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:53:20.142798 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:53:20.142813 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:53:20.142824 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:53:20.142835 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:53:20.142846 | orchestrator |
2025-09-23 18:53:20.142857 | orchestrator | TASK [osism.commons.proxy : Remove system wide settings in environment file] ***
2025-09-23 18:53:20.142868 | orchestrator | Tuesday 23 September 2025 18:53:19 +0000 (0:00:00.625) 0:00:11.587 *****
2025-09-23 18:53:20.142882 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:53:20.142900 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:53:20.142918 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:53:20.142953 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:53:20.142972 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:53:20.143060 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:53:20.143084 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:20.143103 | orchestrator |
2025-09-23 18:53:20.143122 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] ***
2025-09-23 18:53:20.143143 | orchestrator | Tuesday 23 September 2025 18:53:19 +0000 (0:00:00.242) 0:00:12.024 *****
2025-09-23 18:53:20.143162 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:53:20.143176 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:53:20.143204 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:53:33.061352 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:53:33.061460 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:53:33.061476 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:53:33.061487 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:53:33.061499 | orchestrator |
2025-09-23 18:53:33.061511 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] *********************
2025-09-23 18:53:33.061524 | orchestrator | Tuesday 23 September 2025 18:53:20 +0000 (0:00:00.242) 0:00:12.267 *****
2025-09-23 18:53:33.061537 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:53:33.061567 | orchestrator |
2025-09-23 18:53:33.061609 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] ***
2025-09-23 18:53:33.061622 | orchestrator | Tuesday 23 September 2025 18:53:20 +0000 (0:00:00.322) 0:00:12.590 *****
2025-09-23 18:53:33.061634 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:53:33.061645 | orchestrator |
2025-09-23 18:53:33.061656 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] ***
2025-09-23 18:53:33.061666 | orchestrator | Tuesday 23 September 2025 18:53:20 +0000 (0:00:00.305) 0:00:12.895 *****
2025-09-23 18:53:33.061677 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.061690 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:33.061700 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.061711 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:33.061722 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:33.061732 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.061743 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.061754 | orchestrator |
2025-09-23 18:53:33.061765 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] *************
2025-09-23 18:53:33.061775 | orchestrator | Tuesday 23 September 2025 18:53:22 +0000 (0:00:01.791) 0:00:14.687 *****
2025-09-23 18:53:33.061786 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:53:33.061797 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:53:33.061807 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:53:33.061818 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:53:33.061829 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:53:33.061840 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:53:33.061850 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:53:33.061861 | orchestrator |
2025-09-23 18:53:33.061872 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] *****
2025-09-23 18:53:33.061884 | orchestrator | Tuesday 23 September 2025 18:53:22 +0000 (0:00:00.234) 0:00:14.921 *****
2025-09-23 18:53:33.061897 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.061909 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:33.061921 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:33.061934 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:33.061946 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.061959 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.061997 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.062009 | orchestrator |
2025-09-23 18:53:33.062079 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] *******
2025-09-23 18:53:33.062093 | orchestrator | Tuesday 23 September 2025 18:53:23 +0000 (0:00:00.616) 0:00:15.537 *****
2025-09-23 18:53:33.062105 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:53:33.062118 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:53:33.062130 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:53:33.062143 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:53:33.062156 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:53:33.062168 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:53:33.062180 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:53:33.062192 | orchestrator |
2025-09-23 18:53:33.062205 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] ***
2025-09-23 18:53:33.062219 | orchestrator | Tuesday 23 September 2025 18:53:23 +0000 (0:00:00.237) 0:00:15.775 *****
2025-09-23 18:53:33.062232 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.062244 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:53:33.062255 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:53:33.062266 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:53:33.062277 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:53:33.062288 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:53:33.062298 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:53:33.062309 | orchestrator |
2025-09-23 18:53:33.062320 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] *********************
2025-09-23 18:53:33.062340 | orchestrator | Tuesday 23 September 2025 18:53:24 +0000 (0:00:00.574) 0:00:16.349 *****
2025-09-23 18:53:33.062351 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.062362 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:53:33.062373 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:53:33.062383 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:53:33.062394 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:53:33.062405 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:53:33.062416 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:53:33.062426 | orchestrator |
2025-09-23 18:53:33.062437 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ********
2025-09-23 18:53:33.062448 | orchestrator | Tuesday 23 September 2025 18:53:25 +0000 (0:00:01.148) 0:00:17.497 *****
2025-09-23 18:53:33.062459 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.062470 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:33.062481 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.062491 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:33.062502 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.062513 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:33.062524 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.062535 | orchestrator |
2025-09-23 18:53:33.062545 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] ***
2025-09-23 18:53:33.062557 | orchestrator | Tuesday 23 September 2025 18:53:26 +0000 (0:00:01.309) 0:00:18.807 *****
2025-09-23 18:53:33.062586 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:53:33.062598 | orchestrator |
2025-09-23 18:53:33.062610 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] *************
2025-09-23 18:53:33.062621 | orchestrator | Tuesday 23 September 2025 18:53:27 +0000 (0:00:00.330) 0:00:19.138 *****
2025-09-23 18:53:33.062632 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:53:33.062643 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:53:33.062654 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:53:33.062665 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:53:33.062675 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:53:33.062686 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:53:33.062702 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:53:33.062714 | orchestrator |
2025-09-23 18:53:33.062725 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] ***
2025-09-23 18:53:33.062735 | orchestrator | Tuesday 23 September 2025 18:53:28 +0000 (0:00:01.365) 0:00:20.503 *****
2025-09-23 18:53:33.062746 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.062757 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:33.062768 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:33.062779 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:33.062789 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.062800 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.062811 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.062821 | orchestrator |
2025-09-23 18:53:33.062833 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] ***
2025-09-23 18:53:33.062844 | orchestrator | Tuesday 23 September 2025 18:53:28 +0000 (0:00:00.236) 0:00:20.739 *****
2025-09-23 18:53:33.062854 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.062865 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:33.062876 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:33.062887 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:33.062897 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.062908 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.062919 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.062929 | orchestrator |
2025-09-23 18:53:33.062940 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ******************
2025-09-23 18:53:33.062951 | orchestrator | Tuesday 23 September 2025 18:53:28 +0000 (0:00:00.236) 0:00:20.976 *****
2025-09-23 18:53:33.062991 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.063004 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:33.063014 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:33.063025 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:33.063036 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.063046 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.063057 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.063068 | orchestrator |
2025-09-23 18:53:33.063079 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] ***
2025-09-23 18:53:33.063089 | orchestrator | Tuesday 23 September 2025 18:53:29 +0000 (0:00:00.257) 0:00:21.233 *****
2025-09-23 18:53:33.063101 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:53:33.063114 | orchestrator |
2025-09-23 18:53:33.063125 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] *****
2025-09-23 18:53:33.063136 | orchestrator | Tuesday 23 September 2025 18:53:29 +0000 (0:00:00.312) 0:00:21.546 *****
2025-09-23 18:53:33.063146 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.063157 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:33.063168 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:33.063179 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:33.063189 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.063200 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.063211 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.063222 | orchestrator |
2025-09-23 18:53:33.063232 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] *************
2025-09-23 18:53:33.063243 | orchestrator | Tuesday 23 September 2025 18:53:30 +0000 (0:00:00.553) 0:00:22.100 *****
2025-09-23 18:53:33.063254 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:53:33.063265 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:53:33.063276 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:53:33.063287 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:53:33.063297 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:53:33.063308 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:53:33.063319 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:53:33.063329 | orchestrator |
2025-09-23 18:53:33.063340 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] ***************
2025-09-23 18:53:33.063351 | orchestrator | Tuesday 23 September 2025 18:53:30 +0000 (0:00:00.234) 0:00:22.335 *****
2025-09-23 18:53:33.063362 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.063373 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:53:33.063384 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:53:33.063394 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:53:33.063405 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.063416 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.063427 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.063437 | orchestrator |
2025-09-23 18:53:33.063448 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] *********************
2025-09-23 18:53:33.063459 | orchestrator | Tuesday 23 September 2025 18:53:31 +0000 (0:00:01.090) 0:00:23.425 *****
2025-09-23 18:53:33.063470 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.063481 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:53:33.063491 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.063502 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.063513 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:53:33.063523 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:53:33.063534 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:53:33.063545 | orchestrator |
2025-09-23 18:53:33.063556 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] *********************
2025-09-23 18:53:33.063566 | orchestrator | Tuesday 23 September 2025 18:53:31 +0000 (0:00:00.589) 0:00:24.015 *****
2025-09-23 18:53:33.063577 | orchestrator | ok: [testbed-manager]
2025-09-23 18:53:33.063595 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:53:33.063606 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:53:33.063616 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:53:33.063635 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:54:15.995597 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:54:15.995710 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:54:15.995725 | orchestrator |
2025-09-23 18:54:15.995739 | orchestrator | TASK [osism.commons.repository : Update package cache] *************************
2025-09-23 18:54:15.995751 | orchestrator | Tuesday 23 September 2025 18:53:33 +0000 (0:00:01.071) 0:00:25.087 *****
2025-09-23 18:54:15.995763 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:54:15.995775 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:54:15.995786 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:54:15.995797 | orchestrator | changed: [testbed-manager]
2025-09-23 18:54:15.995808 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:54:15.995819 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:54:15.995830 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:54:15.995841 | orchestrator |
2025-09-23 18:54:15.995852 | orchestrator | TASK [osism.services.rsyslog : Gather variables for each operating system] *****
2025-09-23 18:54:15.995863 | orchestrator | Tuesday 23 September 2025 18:53:51 +0000 (0:00:18.446) 0:00:43.533 *****
2025-09-23 18:54:15.995874 | orchestrator | ok: [testbed-manager]
2025-09-23 18:54:15.995918 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:54:15.995930 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:54:15.995941 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:54:15.995951 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:54:15.995962 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:54:15.995973 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:54:15.995983 | orchestrator |
2025-09-23 18:54:15.995994 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_user variable to default value] *****
2025-09-23 18:54:15.996005 | orchestrator | Tuesday 23 September 2025 18:53:51 +0000 (0:00:00.230) 0:00:43.762 *****
2025-09-23 18:54:15.996016 | orchestrator | ok: [testbed-manager]
2025-09-23 18:54:15.996027 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:54:15.996038 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:54:15.996048 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:54:15.996059 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:54:15.996070 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:54:15.996081 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:54:15.996092 | orchestrator |
2025-09-23 18:54:15.996102 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_workdir variable to default value] ***
2025-09-23 18:54:15.996113 | orchestrator | Tuesday 23 September 2025 18:53:51 +0000 (0:00:00.230) 0:00:43.993 *****
2025-09-23 18:54:15.996124 | orchestrator | ok: [testbed-manager]
2025-09-23 18:54:15.996135 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:54:15.996146 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:54:15.996157 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:54:15.996168 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:54:15.996178 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:54:15.996189 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:54:15.996200 | orchestrator |
2025-09-23 18:54:15.996211 | orchestrator | TASK [osism.services.rsyslog : Include distribution specific install tasks] ****
2025-09-23 18:54:15.996222 | orchestrator | Tuesday 23 September 2025 18:53:52 +0000 (0:00:00.264) 0:00:44.257 *****
2025-09-23 18:54:15.996234 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:54:15.996248 | orchestrator |
2025-09-23 18:54:15.996259 | orchestrator | TASK [osism.services.rsyslog : Install rsyslog package] ************************
2025-09-23 18:54:15.996270 | orchestrator | Tuesday 23 September 2025 18:53:52 +0000 (0:00:00.300) 0:00:44.557 *****
2025-09-23 18:54:15.996281 | orchestrator | ok: [testbed-manager]
2025-09-23 18:54:15.996292 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:54:15.996302 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:54:15.996340 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:54:15.996351 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:54:15.996362 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:54:15.996372 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:54:15.996383 | orchestrator |
2025-09-23 18:54:15.996394 | orchestrator | TASK [osism.services.rsyslog : Copy rsyslog.conf configuration file] ***********
2025-09-23 18:54:15.996404 | orchestrator | Tuesday 23 September 2025 18:53:54 +0000 (0:00:01.929) 0:00:46.487 *****
2025-09-23 18:54:15.996415 | orchestrator | changed: [testbed-manager]
2025-09-23 18:54:15.996426 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:54:15.996437 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:54:15.996447 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:54:15.996458 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:54:15.996468 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:54:15.996479 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:54:15.996490 | orchestrator |
2025-09-23 18:54:15.996500 | orchestrator | TASK [osism.services.rsyslog : Manage rsyslog service] *************************
2025-09-23 18:54:15.996530 | orchestrator | Tuesday 23 September 2025 18:53:55 +0000 (0:00:01.138) 0:00:47.626 *****
2025-09-23 18:54:15.996541 | orchestrator | ok: [testbed-manager]
2025-09-23 18:54:15.996552 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:54:15.996563 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:54:15.996574 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:54:15.996584 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:54:15.996595 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:54:15.996606 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:54:15.996616 | orchestrator |
2025-09-23 18:54:15.996627 | orchestrator | TASK [osism.services.rsyslog : Include fluentd tasks] **************************
2025-09-23 18:54:15.996638 | orchestrator | Tuesday 23 September 2025 18:53:56 +0000 (0:00:00.847) 0:00:48.473 *****
2025-09-23 18:54:15.996650 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/fluentd.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:54:15.996662 | orchestrator |
2025-09-23 18:54:15.996673 | orchestrator | TASK [osism.services.rsyslog : Forward syslog message to local fluentd daemon] ***
2025-09-23 18:54:15.996685 | orchestrator | Tuesday 23 September 2025 18:53:56 +0000 (0:00:00.341) 0:00:48.815 *****
2025-09-23 18:54:15.996696 | orchestrator | changed: [testbed-manager]
2025-09-23 18:54:15.996706 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:54:15.996717 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:54:15.996728 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:54:15.996738 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:54:15.996749 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:54:15.996759 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:54:15.996770 | orchestrator |
2025-09-23 18:54:15.996798 | orchestrator | TASK [osism.services.rsyslog :
Include additional log server tasks] ************ 2025-09-23 18:54:15.996810 | orchestrator | Tuesday 23 September 2025 18:53:57 +0000 (0:00:01.067) 0:00:49.882 ***** 2025-09-23 18:54:15.996820 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:54:15.996831 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:54:15.996842 | orchestrator | skipping: [testbed-node-1] 2025-09-23 18:54:15.996853 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:54:15.996863 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:54:15.996874 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:54:15.996909 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:54:15.996920 | orchestrator | 2025-09-23 18:54:15.996931 | orchestrator | TASK [osism.commons.systohc : Install util-linux-extra package] **************** 2025-09-23 18:54:15.996947 | orchestrator | Tuesday 23 September 2025 18:53:58 +0000 (0:00:00.306) 0:00:50.189 ***** 2025-09-23 18:54:15.996958 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:54:15.996968 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:54:15.996979 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:54:15.996990 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:54:15.997010 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:54:15.997021 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:54:15.997031 | orchestrator | changed: [testbed-manager] 2025-09-23 18:54:15.997042 | orchestrator | 2025-09-23 18:54:15.997053 | orchestrator | TASK [osism.commons.systohc : Sync hardware clock] ***************************** 2025-09-23 18:54:15.997064 | orchestrator | Tuesday 23 September 2025 18:54:10 +0000 (0:00:12.024) 0:01:02.213 ***** 2025-09-23 18:54:15.997074 | orchestrator | ok: [testbed-manager] 2025-09-23 18:54:15.997085 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:54:15.997096 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:54:15.997107 | orchestrator | ok: [testbed-node-4] 2025-09-23 
18:54:15.997117 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:54:15.997128 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:54:15.997139 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:54:15.997149 | orchestrator | 2025-09-23 18:54:15.997160 | orchestrator | TASK [osism.commons.configfs : Start sys-kernel-config mount] ****************** 2025-09-23 18:54:15.997171 | orchestrator | Tuesday 23 September 2025 18:54:11 +0000 (0:00:01.502) 0:01:03.715 ***** 2025-09-23 18:54:15.997182 | orchestrator | ok: [testbed-manager] 2025-09-23 18:54:15.997193 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:54:15.997203 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:54:15.997214 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:54:15.997225 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:54:15.997235 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:54:15.997246 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:54:15.997256 | orchestrator | 2025-09-23 18:54:15.997267 | orchestrator | TASK [osism.commons.packages : Gather variables for each operating system] ***** 2025-09-23 18:54:15.997278 | orchestrator | Tuesday 23 September 2025 18:54:12 +0000 (0:00:00.883) 0:01:04.599 ***** 2025-09-23 18:54:15.997289 | orchestrator | ok: [testbed-manager] 2025-09-23 18:54:15.997299 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:54:15.997310 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:54:15.997321 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:54:15.997331 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:54:15.997342 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:54:15.997353 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:54:15.997363 | orchestrator | 2025-09-23 18:54:15.997374 | orchestrator | TASK [osism.commons.packages : Set required_packages_distribution variable to default value] *** 2025-09-23 18:54:15.997385 | orchestrator | Tuesday 23 September 2025 18:54:12 +0000 (0:00:00.253) 0:01:04.853 ***** 2025-09-23 18:54:15.997396 | 
orchestrator | ok: [testbed-manager] 2025-09-23 18:54:15.997407 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:54:15.997417 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:54:15.997428 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:54:15.997438 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:54:15.997449 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:54:15.997460 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:54:15.997470 | orchestrator | 2025-09-23 18:54:15.997481 | orchestrator | TASK [osism.commons.packages : Include distribution specific package tasks] **** 2025-09-23 18:54:15.997492 | orchestrator | Tuesday 23 September 2025 18:54:13 +0000 (0:00:00.245) 0:01:05.098 ***** 2025-09-23 18:54:15.997503 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/packages/tasks/package-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 18:54:15.997514 | orchestrator | 2025-09-23 18:54:15.997525 | orchestrator | TASK [osism.commons.packages : Install needrestart package] ******************** 2025-09-23 18:54:15.997536 | orchestrator | Tuesday 23 September 2025 18:54:13 +0000 (0:00:00.304) 0:01:05.403 ***** 2025-09-23 18:54:15.997546 | orchestrator | ok: [testbed-manager] 2025-09-23 18:54:15.997557 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:54:15.997568 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:54:15.997579 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:54:15.997589 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:54:15.997600 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:54:15.997611 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:54:15.997628 | orchestrator | 2025-09-23 18:54:15.997639 | orchestrator | TASK [osism.commons.packages : Set needrestart mode] *************************** 2025-09-23 18:54:15.997650 | orchestrator | Tuesday 23 September 2025 18:54:15 +0000 
(0:00:01.689) 0:01:07.093 ***** 2025-09-23 18:54:15.997660 | orchestrator | changed: [testbed-manager] 2025-09-23 18:54:15.997671 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:54:15.997682 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:54:15.997693 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:54:15.997703 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:54:15.997714 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:54:15.997725 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:54:15.997736 | orchestrator | 2025-09-23 18:54:15.997747 | orchestrator | TASK [osism.commons.packages : Set apt_cache_valid_time variable to default value] *** 2025-09-23 18:54:15.997757 | orchestrator | Tuesday 23 September 2025 18:54:15 +0000 (0:00:00.673) 0:01:07.766 ***** 2025-09-23 18:54:15.997768 | orchestrator | ok: [testbed-manager] 2025-09-23 18:54:15.997779 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:54:15.997790 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:54:15.997800 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:54:15.997811 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:54:15.997822 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:54:15.997832 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:54:15.997843 | orchestrator | 2025-09-23 18:54:15.997860 | orchestrator | TASK [osism.commons.packages : Update package cache] *************************** 2025-09-23 18:56:32.015910 | orchestrator | Tuesday 23 September 2025 18:54:15 +0000 (0:00:00.254) 0:01:08.021 ***** 2025-09-23 18:56:32.016031 | orchestrator | ok: [testbed-manager] 2025-09-23 18:56:32.016047 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:56:32.016083 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:56:32.016095 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:56:32.016106 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:56:32.016116 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:56:32.016127 | orchestrator | ok: 
[testbed-node-2] 2025-09-23 18:56:32.016138 | orchestrator | 2025-09-23 18:56:32.016150 | orchestrator | TASK [osism.commons.packages : Download upgrade packages] ********************** 2025-09-23 18:56:32.016161 | orchestrator | Tuesday 23 September 2025 18:54:17 +0000 (0:00:01.265) 0:01:09.286 ***** 2025-09-23 18:56:32.016172 | orchestrator | changed: [testbed-manager] 2025-09-23 18:56:32.016183 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:56:32.016212 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:56:32.016223 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:56:32.016234 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:56:32.016244 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:56:32.016255 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:56:32.016266 | orchestrator | 2025-09-23 18:56:32.016278 | orchestrator | TASK [osism.commons.packages : Upgrade packages] ******************************* 2025-09-23 18:56:32.016289 | orchestrator | Tuesday 23 September 2025 18:54:19 +0000 (0:00:02.335) 0:01:11.622 ***** 2025-09-23 18:56:32.016299 | orchestrator | ok: [testbed-manager] 2025-09-23 18:56:32.016310 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:56:32.016321 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:56:32.016331 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:56:32.016342 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:56:32.016353 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:56:32.016364 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:56:32.016374 | orchestrator | 2025-09-23 18:56:32.016385 | orchestrator | TASK [osism.commons.packages : Download required packages] ********************* 2025-09-23 18:56:32.016396 | orchestrator | Tuesday 23 September 2025 18:54:22 +0000 (0:00:02.442) 0:01:14.065 ***** 2025-09-23 18:56:32.016407 | orchestrator | ok: [testbed-manager] 2025-09-23 18:56:32.016417 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:56:32.016428 | orchestrator | 
ok: [testbed-node-3] 2025-09-23 18:56:32.016439 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:56:32.016452 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:56:32.016464 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:56:32.016499 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:56:32.016511 | orchestrator | 2025-09-23 18:56:32.016523 | orchestrator | TASK [osism.commons.packages : Install required packages] ********************** 2025-09-23 18:56:32.016535 | orchestrator | Tuesday 23 September 2025 18:55:01 +0000 (0:00:39.492) 0:01:53.558 ***** 2025-09-23 18:56:32.016546 | orchestrator | changed: [testbed-manager] 2025-09-23 18:56:32.016556 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:56:32.016567 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:56:32.016578 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:56:32.016588 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:56:32.016599 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:56:32.016610 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:56:32.016620 | orchestrator | 2025-09-23 18:56:32.016631 | orchestrator | TASK [osism.commons.packages : Remove useless packages from the cache] ********* 2025-09-23 18:56:32.016642 | orchestrator | Tuesday 23 September 2025 18:56:17 +0000 (0:01:15.613) 0:03:09.171 ***** 2025-09-23 18:56:32.016652 | orchestrator | ok: [testbed-manager] 2025-09-23 18:56:32.016663 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:56:32.016730 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:56:32.016742 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:56:32.016752 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:56:32.016763 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:56:32.016773 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:56:32.016784 | orchestrator | 2025-09-23 18:56:32.016796 | orchestrator | TASK [osism.commons.packages : Remove dependencies that are no longer required] *** 2025-09-23 18:56:32.016808 
| orchestrator | Tuesday 23 September 2025 18:56:18 +0000 (0:00:01.703) 0:03:10.874 ***** 2025-09-23 18:56:32.016818 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:56:32.016829 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:56:32.016840 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:56:32.016850 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:56:32.016861 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:56:32.016872 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:56:32.016882 | orchestrator | changed: [testbed-manager] 2025-09-23 18:56:32.016893 | orchestrator | 2025-09-23 18:56:32.016904 | orchestrator | TASK [osism.commons.sysctl : Include sysctl tasks] ***************************** 2025-09-23 18:56:32.016914 | orchestrator | Tuesday 23 September 2025 18:56:30 +0000 (0:00:11.885) 0:03:22.760 ***** 2025-09-23 18:56:32.016934 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'elasticsearch', 'value': [{'name': 'vm.max_map_count', 'value': 262144}]}) 2025-09-23 18:56:32.016951 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'rabbitmq', 'value': [{'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}, {'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}, {'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}, {'name': 'net.core.wmem_max', 'value': 16777216}, {'name': 'net.core.rmem_max', 'value': 16777216}, {'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}, {'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}, {'name': 'net.core.somaxconn', 'value': 4096}, {'name': 'net.ipv4.tcp_syncookies', 'value': 0}, {'name': 'net.ipv4.tcp_max_syn_backlog', 
'value': 8192}]}) 2025-09-23 18:56:32.016985 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'generic', 'value': [{'name': 'vm.swappiness', 'value': 1}]}) 2025-09-23 18:56:32.017005 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'compute', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2025-09-23 18:56:32.017026 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'network', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2025-09-23 18:56:32.017037 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 => (item={'key': 'k3s_node', 'value': [{'name': 'fs.inotify.max_user_instances', 'value': 1024}]}) 2025-09-23 18:56:32.017048 | orchestrator | 2025-09-23 18:56:32.017059 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on elasticsearch] *********** 2025-09-23 18:56:32.017070 | orchestrator | Tuesday 23 September 2025 18:56:31 +0000 (0:00:00.370) 0:03:23.130 ***** 2025-09-23 18:56:32.017081 | orchestrator | skipping: [testbed-manager] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-09-23 18:56:32.017092 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:56:32.017104 | orchestrator | 
skipping: [testbed-node-3] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-09-23 18:56:32.017114 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:56:32.017125 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-09-23 18:56:32.017136 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:56:32.017146 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-09-23 18:56:32.017157 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:56:32.017168 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-23 18:56:32.017179 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-23 18:56:32.017190 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-23 18:56:32.017201 | orchestrator | 2025-09-23 18:56:32.017212 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on rabbitmq] **************** 2025-09-23 18:56:32.017222 | orchestrator | Tuesday 23 September 2025 18:56:31 +0000 (0:00:00.679) 0:03:23.810 ***** 2025-09-23 18:56:32.017233 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-09-23 18:56:32.017245 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-09-23 18:56:32.017256 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-09-23 18:56:32.017267 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-09-23 18:56:32.017277 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-09-23 18:56:32.017288 | orchestrator | skipping: [testbed-manager] => (item={'name': 
'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-09-23 18:56:32.017299 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-09-23 18:56:32.017310 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-09-23 18:56:32.017321 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-09-23 18:56:32.017332 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-09-23 18:56:32.017343 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:56:32.017354 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-09-23 18:56:32.017365 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-09-23 18:56:32.017382 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-09-23 18:56:32.017393 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-09-23 18:56:32.017411 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-09-23 18:56:32.017422 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-09-23 18:56:32.017434 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-09-23 18:56:32.017452 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-09-23 18:56:41.288791 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-09-23 18:56:41.288904 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-09-23 
18:56:41.288922 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-09-23 18:56:41.288936 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-09-23 18:56:41.288963 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-09-23 18:56:41.288975 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-09-23 18:56:41.288986 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-09-23 18:56:41.288998 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-09-23 18:56:41.289010 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-09-23 18:56:41.289022 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-09-23 18:56:41.289039 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-09-23 18:56:41.289051 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-09-23 18:56:41.289063 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:56:41.289075 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:56:41.289087 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-09-23 18:56:41.289098 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-09-23 18:56:41.289110 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-09-23 18:56:41.289121 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-09-23 
18:56:41.289132 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-09-23 18:56:41.289143 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-09-23 18:56:41.289154 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-09-23 18:56:41.289166 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-09-23 18:56:41.289177 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-09-23 18:56:41.289188 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-09-23 18:56:41.289200 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:56:41.289211 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-09-23 18:56:41.289222 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-09-23 18:56:41.289257 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-09-23 18:56:41.289269 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-09-23 18:56:41.289280 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-09-23 18:56:41.289291 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-09-23 18:56:41.289302 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-09-23 18:56:41.289313 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-09-23 18:56:41.289325 | orchestrator | changed: [testbed-node-1] => (item={'name': 
'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-09-23 18:56:41.289335 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-09-23 18:56:41.289347 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-09-23 18:56:41.289358 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-09-23 18:56:41.289369 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-09-23 18:56:41.289380 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-09-23 18:56:41.289392 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-09-23 18:56:41.289403 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-09-23 18:56:41.289414 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-09-23 18:56:41.289426 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-09-23 18:56:41.289456 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-09-23 18:56:41.289468 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-09-23 18:56:41.289479 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-09-23 18:56:41.289491 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-09-23 18:56:41.289502 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-09-23 18:56:41.289518 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-09-23 18:56:41.289529 | 
orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.somaxconn', 'value': 4096})
2025-09-23 18:56:41.289540 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2025-09-23 18:56:41.289551 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})
2025-09-23 18:56:41.289563 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2025-09-23 18:56:41.289574 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2025-09-23 18:56:41.289586 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})
2025-09-23 18:56:41.289597 | orchestrator |
2025-09-23 18:56:41.289609 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on generic] *****************
2025-09-23 18:56:41.289620 | orchestrator | Tuesday 23 September 2025 18:56:36 +0000 (0:00:04.725) 0:03:28.536 *****
2025-09-23 18:56:41.289631 | orchestrator | changed: [testbed-manager] => (item={'name': 'vm.swappiness', 'value': 1})
2025-09-23 18:56:41.289642 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.swappiness', 'value': 1})
2025-09-23 18:56:41.289695 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.swappiness', 'value': 1})
2025-09-23 18:56:41.289720 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 1})
2025-09-23 18:56:41.289731 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.swappiness', 'value': 1})
2025-09-23 18:56:41.289742 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 1})
2025-09-23 18:56:41.289752 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 1})
2025-09-23 18:56:41.289763 | orchestrator |
2025-09-23 18:56:41.289774 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on compute] *****************
2025-09-23 18:56:41.289785 | orchestrator | Tuesday 23 September 2025 18:56:38 +0000 (0:00:01.558) 0:03:30.095 *****
2025-09-23 18:56:41.289795 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289806 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:56:41.289817 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289828 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289839 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:56:41.289849 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289860 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:56:41.289871 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:56:41.289882 | orchestrator | changed: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289893 | orchestrator | changed: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289904 | orchestrator | changed: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289915 | orchestrator |
2025-09-23 18:56:41.289926 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on network] *****************
2025-09-23 18:56:41.289937 | orchestrator | Tuesday 23 September 2025 18:56:39 +0000 (0:00:01.675) 0:03:31.770 *****
2025-09-23 18:56:41.289947 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289958 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:56:41.289969 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289980 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.289990 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:56:41.290001 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:56:41.290012 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.290085 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:56:41.290097 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.290107 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.290118 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})
2025-09-23 18:56:41.290129 | orchestrator |
2025-09-23 18:56:41.290148 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on k3s_node] ****************
2025-09-23 18:56:53.824036 | orchestrator | Tuesday 23 September 2025 18:56:41 +0000 (0:00:01.545) 0:03:33.315 *****
2025-09-23 18:56:53.824121 | orchestrator | skipping: [testbed-manager] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2025-09-23 18:56:53.824132 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:56:53.824141 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2025-09-23 18:56:53.824149 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2025-09-23 18:56:53.824186 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:56:53.824195 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2025-09-23 18:56:53.824202 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:56:53.824210 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:56:53.824217 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2025-09-23 18:56:53.824225 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2025-09-23 18:56:53.824232 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})
2025-09-23 18:56:53.824239 | orchestrator |
2025-09-23 18:56:53.824247 | orchestrator | TASK [osism.commons.limits : Include limits tasks] *****************************
2025-09-23 18:56:53.824255 | orchestrator | Tuesday 23 September 2025 18:56:42 +0000 (0:00:00.752) 0:03:34.068 *****
2025-09-23 18:56:53.824262 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:56:53.824269 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:56:53.824277 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:56:53.824284 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:56:53.824291 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:56:53.824299 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:56:53.824306 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:56:53.824313 | orchestrator |
2025-09-23 18:56:53.824320 | orchestrator | TASK [osism.commons.services : Populate service facts] *************************
2025-09-23 18:56:53.824328 | orchestrator | Tuesday 23 September 2025 18:56:42 +0000 (0:00:00.326) 0:03:34.395 *****
2025-09-23 18:56:53.824335 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:56:53.824343 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:56:53.824351 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:56:53.824358 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:56:53.824365 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:56:53.824372 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:56:53.824380 | orchestrator | ok: [testbed-manager]
2025-09-23 18:56:53.824387 | orchestrator |
2025-09-23 18:56:53.824395 | orchestrator | TASK [osism.commons.services : Check services] *********************************
2025-09-23 18:56:53.824402 | orchestrator | Tuesday 23 September 2025 18:56:47 +0000 (0:00:05.504) 0:03:39.900 *****
2025-09-23 18:56:53.824410 | orchestrator | skipping: [testbed-manager] => (item=nscd)
2025-09-23 18:56:53.824418 | orchestrator | skipping: [testbed-node-0] => (item=nscd)
2025-09-23 18:56:53.824425 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:56:53.824432 | orchestrator | skipping: [testbed-node-1] => (item=nscd)
2025-09-23 18:56:53.824439 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:56:53.824447 | orchestrator | skipping: [testbed-node-2] => (item=nscd)
2025-09-23 18:56:53.824454 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:56:53.824461 | orchestrator | skipping: [testbed-node-3] => (item=nscd)
2025-09-23 18:56:53.824468 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:56:53.824475 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:56:53.824483 | orchestrator | skipping: [testbed-node-4] => (item=nscd)
2025-09-23 18:56:53.824490 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:56:53.824497 | orchestrator | skipping: [testbed-node-5] => (item=nscd)
2025-09-23 18:56:53.824504 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:56:53.824511 | orchestrator |
2025-09-23 18:56:53.824519 | orchestrator | TASK [osism.commons.services : Start/enable required services] *****************
2025-09-23 18:56:53.824526 | orchestrator | Tuesday 23 September 2025 18:56:48 +0000 (0:00:00.287) 0:03:40.188 *****
2025-09-23 18:56:53.824533 | orchestrator | ok: [testbed-manager] => (item=cron)
2025-09-23 18:56:53.824541 | orchestrator | ok: [testbed-node-0] => (item=cron)
2025-09-23 18:56:53.824548 | orchestrator | ok: [testbed-node-1] => (item=cron)
2025-09-23 18:56:53.824555 | orchestrator | ok: [testbed-node-2] => (item=cron)
2025-09-23 18:56:53.824568 | orchestrator | ok: [testbed-node-3] => (item=cron)
2025-09-23 18:56:53.824575 | orchestrator | ok: [testbed-node-4] => (item=cron)
2025-09-23 18:56:53.824583 | orchestrator | ok: [testbed-node-5] => (item=cron)
2025-09-23 18:56:53.824590 | orchestrator |
2025-09-23 18:56:53.824597 | orchestrator | TASK [osism.commons.motd : Include distribution specific configure tasks] ******
2025-09-23 18:56:53.824605 | orchestrator | Tuesday 23 September 2025 18:56:49 +0000 (0:00:01.002) 0:03:41.190 *****
2025-09-23 18:56:53.824613 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/motd/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:56:53.824623 | orchestrator |
2025-09-23 18:56:53.824630 | orchestrator | TASK [osism.commons.motd : Remove update-motd package] *************************
2025-09-23 18:56:53.824667 | orchestrator | Tuesday 23 September 2025 18:56:49 +0000 (0:00:00.514) 0:03:41.705 *****
2025-09-23 18:56:53.824675 | orchestrator | ok: [testbed-manager]
2025-09-23 18:56:53.824682 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:56:53.824690 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:56:53.824697 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:56:53.824704 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:56:53.824712 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:56:53.824719 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:56:53.824726 | orchestrator |
2025-09-23 18:56:53.824734 | orchestrator | TASK [osism.commons.motd : Check if /etc/default/motd-news exists] *************
2025-09-23 18:56:53.824741 | orchestrator | Tuesday 23 September 2025 18:56:51 +0000 (0:00:01.353) 0:03:43.058 *****
2025-09-23 18:56:53.824748 | orchestrator | ok: [testbed-manager]
2025-09-23 18:56:53.824769 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:56:53.824777 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:56:53.824784 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:56:53.824792 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:56:53.824799 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:56:53.824806 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:56:53.824813 | orchestrator |
2025-09-23 18:56:53.824820 | orchestrator | TASK [osism.commons.motd : Disable the dynamic motd-news service] **************
2025-09-23 18:56:53.824828 | orchestrator | Tuesday 23 September 2025 18:56:51 +0000 (0:00:00.623) 0:03:43.682 *****
2025-09-23 18:56:53.824835 | orchestrator | changed: [testbed-manager]
2025-09-23 18:56:53.824842 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:56:53.824850 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:56:53.824857 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:56:53.824865 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:56:53.824872 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:56:53.824879 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:56:53.824887 | orchestrator |
2025-09-23 18:56:53.824894 | orchestrator | TASK [osism.commons.motd : Get all configuration files in /etc/pam.d] **********
2025-09-23 18:56:53.824901 | orchestrator | Tuesday 23 September 2025 18:56:52 +0000 (0:00:00.557) 0:03:44.299 *****
2025-09-23 18:56:53.824909 | orchestrator | ok: [testbed-manager]
2025-09-23 18:56:53.824916 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:56:53.824923 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:56:53.824930 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:56:53.824938 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:56:53.824945 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:56:53.824952 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:56:53.824959 | orchestrator |
2025-09-23 18:56:53.824967 | orchestrator | TASK [osism.commons.motd : Remove pam_motd.so rule] ****************************
2025-09-23 18:56:53.824974 | orchestrator | Tuesday 23 September 2025 18:56:52 +0000 (0:00:00.557) 0:03:44.856 *****
2025-09-23 18:56:53.824984 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758652350.2333994, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:56:53.825005 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758652389.5180185, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:56:53.825013 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758652385.004759, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:56:53.825021 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758652385.801471, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:56:53.825029 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758652394.8072722, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:56:53.825049 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758652386.1997905, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285509 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758652395.4217832, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285671 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285712 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285725 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285737 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285749 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285760 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285806 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})
2025-09-23 18:57:09.285820 | orchestrator |
2025-09-23 18:57:09.285833 | orchestrator | TASK [osism.commons.motd : Copy motd file] *************************************
2025-09-23 18:57:09.285845 | orchestrator | Tuesday 23 September 2025 18:56:53 +0000 (0:00:00.992) 0:03:45.849 *****
2025-09-23 18:57:09.285857 | orchestrator | changed: [testbed-manager]
2025-09-23 18:57:09.285877 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:57:09.285888 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:57:09.285899 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:57:09.285910 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:57:09.285921 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:57:09.285931 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:57:09.285942 | orchestrator |
2025-09-23 18:57:09.285953 | orchestrator | TASK [osism.commons.motd : Copy issue file] ************************************
2025-09-23 18:57:09.285964 | orchestrator | Tuesday 23 September 2025 18:56:54 +0000 (0:00:01.186) 0:03:47.036 *****
2025-09-23 18:57:09.285975 | orchestrator | changed: [testbed-manager]
2025-09-23 18:57:09.285986 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:57:09.285996 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:57:09.286007 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:57:09.286077 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:57:09.286090 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:57:09.286101 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:57:09.286111 | orchestrator |
2025-09-23 18:57:09.286122 | orchestrator | TASK [osism.commons.motd : Copy issue.net file] ********************************
2025-09-23 18:57:09.286133 | orchestrator | Tuesday 23 September 2025 18:56:56 +0000 (0:00:01.350) 0:03:48.387 *****
2025-09-23 18:57:09.286144 | orchestrator | changed: [testbed-manager]
2025-09-23 18:57:09.286155 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:57:09.286166 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:57:09.286176 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:57:09.286187 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:57:09.286198 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:57:09.286209 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:57:09.286236 | orchestrator |
2025-09-23 18:57:09.286259 | orchestrator | TASK [osism.commons.motd : Configure SSH to print the motd] ********************
2025-09-23 18:57:09.286270 | orchestrator | Tuesday 23 September 2025 18:56:57 +0000 (0:00:01.181) 0:03:49.569 *****
2025-09-23 18:57:09.286280 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:57:09.286291 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:57:09.286302 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:57:09.286313 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:57:09.286323 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:57:09.286334 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:57:09.286345 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:57:09.286356 | orchestrator |
2025-09-23 18:57:09.286367 | orchestrator | TASK [osism.commons.motd : Configure SSH to not print the motd] ****************
2025-09-23 18:57:09.286378 | orchestrator | Tuesday 23 September 2025 18:56:57 +0000 (0:00:00.262) 0:03:49.831 *****
2025-09-23 18:57:09.286388 | orchestrator | ok: [testbed-manager]
2025-09-23 18:57:09.286401 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:57:09.286412 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:57:09.286422 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:57:09.286433 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:57:09.286444 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:57:09.286454 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:57:09.286465 | orchestrator |
2025-09-23 18:57:09.286476 | orchestrator | TASK [osism.services.rng : Include distribution specific install tasks] ********
2025-09-23 18:57:09.286487 | orchestrator | Tuesday 23 September 2025 18:56:58 +0000 (0:00:00.767) 0:03:50.599 *****
2025-09-23 18:57:09.286499 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rng/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:57:09.286512 | orchestrator |
2025-09-23 18:57:09.286523 | orchestrator | TASK [osism.services.rng : Install rng package] ********************************
2025-09-23 18:57:09.286534 | orchestrator | Tuesday 23 September 2025 18:56:58 +0000 (0:00:00.411) 0:03:51.010 *****
2025-09-23 18:57:09.286545 | orchestrator | ok: [testbed-manager]
2025-09-23 18:57:09.286556 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:57:09.286574 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:57:09.286585 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:57:09.286596 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:57:09.286607 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:57:09.286618 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:57:09.286647 | orchestrator |
2025-09-23 18:57:09.286660 | orchestrator | TASK [osism.services.rng : Remove haveged package] *****************************
2025-09-23 18:57:09.286672 | orchestrator | Tuesday 23 September 2025 18:57:07 +0000 (0:00:08.036) 0:03:59.046 *****
2025-09-23 18:57:09.286683 | orchestrator | ok: [testbed-manager]
2025-09-23 18:57:09.286695 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:57:09.286707 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:57:09.286718 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:57:09.286730 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:57:09.286741 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:57:09.286753 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:57:09.286764 | orchestrator |
2025-09-23 18:57:09.286776 | orchestrator | TASK [osism.services.rng : Manage rng service] *********************************
2025-09-23 18:57:09.286788 | orchestrator | Tuesday 23 September 2025 18:57:08 +0000 (0:00:01.252) 0:04:00.299 *****
2025-09-23 18:57:09.286800 | orchestrator | ok: [testbed-manager]
2025-09-23 18:57:09.286811 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:57:09.286823 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:57:09.286834 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:57:09.286846 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:57:09.286857 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:57:09.286869 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:57:09.286880 | orchestrator |
2025-09-23 18:57:09.286907 | orchestrator | TASK [osism.commons.cleanup : Gather variables for each operating system] ******
2025-09-23 18:58:19.007853 | orchestrator | Tuesday 23 September 2025 18:57:09 +0000 (0:00:01.007) 0:04:01.307 *****
2025-09-23 18:58:19.007986 | orchestrator | ok: [testbed-manager]
2025-09-23 18:58:19.007996 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:58:19.008004 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:58:19.008010 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:58:19.008018 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:58:19.008024 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:58:19.008031 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:58:19.008039 | orchestrator |
2025-09-23 18:58:19.008048 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_packages_distribution variable to default value] ***
2025-09-23 18:58:19.008057 | orchestrator | Tuesday 23 September 2025 18:57:09 +0000 (0:00:00.335) 0:04:01.642 *****
2025-09-23 18:58:19.008064 | orchestrator | ok: [testbed-manager]
2025-09-23 18:58:19.008070 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:58:19.008076 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:58:19.008082 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:58:19.008089 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:58:19.008095 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:58:19.008102 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:58:19.008109 | orchestrator |
2025-09-23 18:58:19.008116 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_services_distribution variable to default value] ***
2025-09-23 18:58:19.008123 | orchestrator | Tuesday 23 September 2025 18:57:10 +0000 (0:00:00.499) 0:04:02.141 *****
2025-09-23 18:58:19.008130 | orchestrator | ok: [testbed-manager]
2025-09-23 18:58:19.008137 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:58:19.008144 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:58:19.008151 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:58:19.008158 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:58:19.008165 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:58:19.008172 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:58:19.008178 | orchestrator |
2025-09-23 18:58:19.008185 | orchestrator | TASK [osism.commons.cleanup : Populate service facts] **************************
2025-09-23 18:58:19.008192 | orchestrator | Tuesday 23 September 2025 18:57:10 +0000 (0:00:00.335) 0:04:02.477 *****
2025-09-23 18:58:19.008199 | orchestrator | ok: [testbed-manager]
2025-09-23 18:58:19.008232 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:58:19.008239 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:58:19.008246 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:58:19.008253 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:58:19.008260 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:58:19.008266 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:58:19.008273 | orchestrator |
2025-09-23 18:58:19.008279 | orchestrator | TASK [osism.commons.cleanup : Include distribution specific timer tasks] *******
2025-09-23 18:58:19.008286 | orchestrator | Tuesday 23 September 2025 18:57:15 +0000 (0:00:05.489) 0:04:07.966 *****
2025-09-23 18:58:19.008295 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/timers-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:58:19.008305 | orchestrator |
2025-09-23 18:58:19.008312 | orchestrator | TASK [osism.commons.cleanup : Disable apt-daily timers] ************************
2025-09-23 18:58:19.008318 | orchestrator | Tuesday 23 September 2025 18:57:16 +0000 (0:00:00.463) 0:04:08.430 *****
2025-09-23 18:58:19.008326 | orchestrator | skipping: [testbed-manager] => (item=apt-daily-upgrade)
2025-09-23 18:58:19.008333 | orchestrator | skipping: [testbed-manager] => (item=apt-daily)
2025-09-23 18:58:19.008340 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily-upgrade)
2025-09-23 18:58:19.008347 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily)
2025-09-23 18:58:19.008354 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:58:19.008361 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily-upgrade)
2025-09-23 18:58:19.008368 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily)
2025-09-23 18:58:19.008375 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:58:19.008385 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:58:19.008395 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily-upgrade)
2025-09-23 18:58:19.008403 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily)
2025-09-23 18:58:19.008413 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily-upgrade)
2025-09-23 18:58:19.008423 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily)
2025-09-23 18:58:19.008433 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:58:19.008444 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:58:19.008454 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily-upgrade)
2025-09-23 18:58:19.008465 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily)
2025-09-23 18:58:19.008476 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:58:19.008484 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily-upgrade)
2025-09-23 18:58:19.008493 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily)
2025-09-23 18:58:19.008501 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:58:19.008510 | orchestrator |
2025-09-23 18:58:19.008518 | orchestrator | TASK [osism.commons.cleanup : Include service tasks] ***************************
2025-09-23 18:58:19.008527 | orchestrator | Tuesday 23 September 2025 18:57:16 +0000 (0:00:00.364) 0:04:08.795 *****
2025-09-23 18:58:19.008536 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/services-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:58:19.008547 | orchestrator |
2025-09-23 18:58:19.008557 | orchestrator | TASK [osism.commons.cleanup : Cleanup services] ********************************
2025-09-23 18:58:19.008584 | orchestrator | Tuesday 23 September 2025 18:57:17 +0000 (0:00:00.419) 0:04:09.215 *****
2025-09-23 18:58:19.008594 | orchestrator | skipping: [testbed-manager] => (item=ModemManager.service)
2025-09-23 18:58:19.008605 | orchestrator | skipping: [testbed-node-0] => (item=ModemManager.service)
2025-09-23 18:58:19.008613 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:58:19.008622 | orchestrator | skipping: [testbed-node-1] => (item=ModemManager.service)
2025-09-23 18:58:19.008630 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:58:19.008667 | orchestrator | skipping: [testbed-node-2] => (item=ModemManager.service)
2025-09-23 18:58:19.008677 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:58:19.008688 | orchestrator | skipping: [testbed-node-3] => (item=ModemManager.service)
2025-09-23 18:58:19.008699 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:58:19.008710 | orchestrator | skipping: [testbed-node-4] => (item=ModemManager.service)
2025-09-23 18:58:19.008719 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:58:19.008730 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:58:19.008738 | orchestrator | skipping: [testbed-node-5] => (item=ModemManager.service)
2025-09-23 18:58:19.008745 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:58:19.008752 | orchestrator |
2025-09-23 18:58:19.008759 | orchestrator | TASK [osism.commons.cleanup : Include packages tasks] **************************
2025-09-23 18:58:19.008765 | orchestrator | Tuesday 23 September 2025 18:57:17 +0000 (0:00:00.372) 0:04:09.587 *****
2025-09-23 18:58:19.008772 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/packages-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:58:19.008779 | orchestrator |
2025-09-23 18:58:19.008786 | orchestrator | TASK [osism.commons.cleanup : Cleanup installed packages] **********************
2025-09-23 18:58:19.008793 | orchestrator | Tuesday 23 September 2025 18:57:17 +0000 (0:00:00.449) 0:04:10.036 *****
2025-09-23 18:58:19.008799 | orchestrator | changed: [testbed-manager]
2025-09-23 18:58:19.008806 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:58:19.008813 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:58:19.008819 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:58:19.008826 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:58:19.008833 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:58:19.008840 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:58:19.008846 | orchestrator |
2025-09-23 18:58:19.008853 | orchestrator | TASK [osism.commons.cleanup : Remove cloudinit package] ************************
2025-09-23 18:58:19.008860 | orchestrator | Tuesday 23 September 2025 18:57:53 +0000 (0:00:35.012) 0:04:45.049 *****
2025-09-23 18:58:19.008867 | orchestrator | changed: [testbed-manager]
2025-09-23 18:58:19.008873 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:58:19.008880 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:58:19.008887 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:58:19.008893 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:58:19.008900 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:58:19.008907 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:58:19.008913 | orchestrator |
2025-09-23 18:58:19.008920 | orchestrator | TASK [osism.commons.cleanup : Uninstall unattended-upgrades package] ***********
2025-09-23 18:58:19.008927 | orchestrator | Tuesday 23 September 2025 18:58:01 +0000 (0:00:08.325) 0:04:53.374 *****
2025-09-23 18:58:19.008934 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:58:19.008940 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:58:19.008947 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:58:19.008974 | orchestrator | changed: [testbed-manager]
2025-09-23 18:58:19.008982 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:58:19.008988 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:58:19.008995 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:58:19.009002 | orchestrator |
2025-09-23 18:58:19.009008 | orchestrator | TASK [osism.commons.cleanup : Remove useless packages from the cache] **********
2025-09-23 18:58:19.009015 | orchestrator | Tuesday 23 September 2025 18:58:08 +0000 (0:00:06.945) 0:05:00.320 *****
2025-09-23 18:58:19.009022 | orchestrator | ok: [testbed-manager]
2025-09-23 18:58:19.009028 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:58:19.009035 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:58:19.009042 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:58:19.009049 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:58:19.009055 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:58:19.009062 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:58:19.009072 | orchestrator |
2025-09-23 18:58:19.009079 | orchestrator | TASK [osism.commons.cleanup : Remove dependencies that are no longer required] ***
2025-09-23 18:58:19.009086 | orchestrator | Tuesday 23 September 2025 18:58:09 +0000 (0:00:01.694) 0:05:02.015 *****
2025-09-23 18:58:19.009092 | orchestrator | changed: [testbed-manager]
2025-09-23 18:58:19.009099 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:58:19.009106 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:58:19.009112 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:58:19.009119 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:58:19.009126 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:58:19.009133 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:58:19.009139 | orchestrator |
2025-09-23 18:58:19.009146 | orchestrator | TASK [osism.commons.cleanup : Include cloudinit tasks] *************************
2025-09-23 18:58:19.009153 | orchestrator | Tuesday 23 September 2025 18:58:16 +0000 (0:00:06.063) 0:05:08.079 *****
2025-09-23 18:58:19.009161 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/cloudinit.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23
18:58:19.009170 | orchestrator | 2025-09-23 18:58:19.009176 | orchestrator | TASK [osism.commons.cleanup : Remove cloud-init configuration directory] ******* 2025-09-23 18:58:19.009183 | orchestrator | Tuesday 23 September 2025 18:58:16 +0000 (0:00:00.552) 0:05:08.631 ***** 2025-09-23 18:58:19.009190 | orchestrator | changed: [testbed-manager] 2025-09-23 18:58:19.009197 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:58:19.009203 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:58:19.009210 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:58:19.009217 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:58:19.009224 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:58:19.009230 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:58:19.009237 | orchestrator | 2025-09-23 18:58:19.009243 | orchestrator | TASK [osism.commons.timezone : Install tzdata package] ************************* 2025-09-23 18:58:19.009249 | orchestrator | Tuesday 23 September 2025 18:58:17 +0000 (0:00:00.731) 0:05:09.363 ***** 2025-09-23 18:58:19.009255 | orchestrator | ok: [testbed-manager] 2025-09-23 18:58:19.009261 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:58:19.009267 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:58:19.009273 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:58:19.009289 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:58:33.566367 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:58:33.566480 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:58:33.566498 | orchestrator | 2025-09-23 18:58:33.566506 | orchestrator | TASK [osism.commons.timezone : Set timezone to UTC] **************************** 2025-09-23 18:58:33.566513 | orchestrator | Tuesday 23 September 2025 18:58:18 +0000 (0:00:01.669) 0:05:11.032 ***** 2025-09-23 18:58:33.566519 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:58:33.566526 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:58:33.566531 | orchestrator | changed: 
[testbed-node-2] 2025-09-23 18:58:33.566536 | orchestrator | changed: [testbed-manager] 2025-09-23 18:58:33.566542 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:58:33.566547 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:58:33.566585 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:58:33.566590 | orchestrator | 2025-09-23 18:58:33.566596 | orchestrator | TASK [osism.commons.timezone : Create /etc/adjtime file] *********************** 2025-09-23 18:58:33.566601 | orchestrator | Tuesday 23 September 2025 18:58:19 +0000 (0:00:00.725) 0:05:11.758 ***** 2025-09-23 18:58:33.566607 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:58:33.566612 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:58:33.566617 | orchestrator | skipping: [testbed-node-1] 2025-09-23 18:58:33.566622 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:58:33.566627 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:58:33.566633 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:58:33.566638 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:58:33.566643 | orchestrator | 2025-09-23 18:58:33.566671 | orchestrator | TASK [osism.commons.timezone : Ensure UTC in /etc/adjtime] ********************* 2025-09-23 18:58:33.566676 | orchestrator | Tuesday 23 September 2025 18:58:19 +0000 (0:00:00.258) 0:05:12.016 ***** 2025-09-23 18:58:33.566681 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:58:33.566686 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:58:33.566691 | orchestrator | skipping: [testbed-node-1] 2025-09-23 18:58:33.566696 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:58:33.566701 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:58:33.566707 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:58:33.566712 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:58:33.566717 | orchestrator | 2025-09-23 18:58:33.566722 | orchestrator | TASK [osism.services.docker : Gather 
variables for each operating system] ****** 2025-09-23 18:58:33.566727 | orchestrator | Tuesday 23 September 2025 18:58:20 +0000 (0:00:00.381) 0:05:12.398 ***** 2025-09-23 18:58:33.566732 | orchestrator | ok: [testbed-manager] 2025-09-23 18:58:33.566738 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:58:33.566743 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:58:33.566748 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:58:33.566753 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:58:33.566758 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:58:33.566762 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:58:33.566767 | orchestrator | 2025-09-23 18:58:33.566772 | orchestrator | TASK [osism.services.docker : Set docker_version variable to default value] **** 2025-09-23 18:58:33.566777 | orchestrator | Tuesday 23 September 2025 18:58:20 +0000 (0:00:00.280) 0:05:12.678 ***** 2025-09-23 18:58:33.566783 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:58:33.566788 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:58:33.566793 | orchestrator | skipping: [testbed-node-1] 2025-09-23 18:58:33.566798 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:58:33.566803 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:58:33.566808 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:58:33.566813 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:58:33.566818 | orchestrator | 2025-09-23 18:58:33.566823 | orchestrator | TASK [osism.services.docker : Set docker_cli_version variable to default value] *** 2025-09-23 18:58:33.566830 | orchestrator | Tuesday 23 September 2025 18:58:20 +0000 (0:00:00.248) 0:05:12.926 ***** 2025-09-23 18:58:33.566835 | orchestrator | ok: [testbed-manager] 2025-09-23 18:58:33.566840 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:58:33.566845 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:58:33.566850 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:58:33.566855 | orchestrator | ok: 
[testbed-node-3] 2025-09-23 18:58:33.566860 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:58:33.566865 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:58:33.566870 | orchestrator | 2025-09-23 18:58:33.566875 | orchestrator | TASK [osism.services.docker : Print used docker version] *********************** 2025-09-23 18:58:33.566880 | orchestrator | Tuesday 23 September 2025 18:58:21 +0000 (0:00:00.323) 0:05:13.249 ***** 2025-09-23 18:58:33.566885 | orchestrator | ok: [testbed-manager] =>  2025-09-23 18:58:33.566890 | orchestrator |  docker_version: 5:27.5.1 2025-09-23 18:58:33.566895 | orchestrator | ok: [testbed-node-0] =>  2025-09-23 18:58:33.566901 | orchestrator |  docker_version: 5:27.5.1 2025-09-23 18:58:33.566907 | orchestrator | ok: [testbed-node-1] =>  2025-09-23 18:58:33.566912 | orchestrator |  docker_version: 5:27.5.1 2025-09-23 18:58:33.566918 | orchestrator | ok: [testbed-node-2] =>  2025-09-23 18:58:33.566923 | orchestrator |  docker_version: 5:27.5.1 2025-09-23 18:58:33.566929 | orchestrator | ok: [testbed-node-3] =>  2025-09-23 18:58:33.566935 | orchestrator |  docker_version: 5:27.5.1 2025-09-23 18:58:33.566940 | orchestrator | ok: [testbed-node-4] =>  2025-09-23 18:58:33.566946 | orchestrator |  docker_version: 5:27.5.1 2025-09-23 18:58:33.566951 | orchestrator | ok: [testbed-node-5] =>  2025-09-23 18:58:33.566957 | orchestrator |  docker_version: 5:27.5.1 2025-09-23 18:58:33.566963 | orchestrator | 2025-09-23 18:58:33.566968 | orchestrator | TASK [osism.services.docker : Print used docker cli version] ******************* 2025-09-23 18:58:33.566974 | orchestrator | Tuesday 23 September 2025 18:58:21 +0000 (0:00:00.282) 0:05:13.532 ***** 2025-09-23 18:58:33.566986 | orchestrator | ok: [testbed-manager] =>  2025-09-23 18:58:33.566992 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-23 18:58:33.566998 | orchestrator | ok: [testbed-node-0] =>  2025-09-23 18:58:33.567003 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-23 
18:58:33.567009 | orchestrator | ok: [testbed-node-1] =>  2025-09-23 18:58:33.567014 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-23 18:58:33.567020 | orchestrator | ok: [testbed-node-2] =>  2025-09-23 18:58:33.567025 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-23 18:58:33.567031 | orchestrator | ok: [testbed-node-3] =>  2025-09-23 18:58:33.567037 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-23 18:58:33.567042 | orchestrator | ok: [testbed-node-4] =>  2025-09-23 18:58:33.567048 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-23 18:58:33.567054 | orchestrator | ok: [testbed-node-5] =>  2025-09-23 18:58:33.567059 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-23 18:58:33.567065 | orchestrator | 2025-09-23 18:58:33.567071 | orchestrator | TASK [osism.services.docker : Include block storage tasks] ********************* 2025-09-23 18:58:33.567104 | orchestrator | Tuesday 23 September 2025 18:58:21 +0000 (0:00:00.287) 0:05:13.819 ***** 2025-09-23 18:58:33.567111 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:58:33.567116 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:58:33.567122 | orchestrator | skipping: [testbed-node-1] 2025-09-23 18:58:33.567127 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:58:33.567133 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:58:33.567139 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:58:33.567144 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:58:33.567150 | orchestrator | 2025-09-23 18:58:33.567156 | orchestrator | TASK [osism.services.docker : Include zram storage tasks] ********************** 2025-09-23 18:58:33.567162 | orchestrator | Tuesday 23 September 2025 18:58:22 +0000 (0:00:00.299) 0:05:14.119 ***** 2025-09-23 18:58:33.567167 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:58:33.567173 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:58:33.567179 | orchestrator | skipping: [testbed-node-1] 2025-09-23 
18:58:33.567185 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:58:33.567190 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:58:33.567196 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:58:33.567201 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:58:33.567207 | orchestrator | 2025-09-23 18:58:33.567213 | orchestrator | TASK [osism.services.docker : Include docker install tasks] ******************** 2025-09-23 18:58:33.567219 | orchestrator | Tuesday 23 September 2025 18:58:22 +0000 (0:00:00.265) 0:05:14.385 ***** 2025-09-23 18:58:33.567226 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/install-docker-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 18:58:33.567234 | orchestrator | 2025-09-23 18:58:33.567240 | orchestrator | TASK [osism.services.docker : Remove old architecture-dependent repository] **** 2025-09-23 18:58:33.567246 | orchestrator | Tuesday 23 September 2025 18:58:22 +0000 (0:00:00.421) 0:05:14.806 ***** 2025-09-23 18:58:33.567252 | orchestrator | ok: [testbed-manager] 2025-09-23 18:58:33.567257 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:58:33.567262 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:58:33.567267 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:58:33.567272 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:58:33.567277 | orchestrator | ok: [testbed-node-4] 2025-09-23 18:58:33.567282 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:58:33.567287 | orchestrator | 2025-09-23 18:58:33.567292 | orchestrator | TASK [osism.services.docker : Gather package facts] **************************** 2025-09-23 18:58:33.567297 | orchestrator | Tuesday 23 September 2025 18:58:23 +0000 (0:00:00.794) 0:05:15.601 ***** 2025-09-23 18:58:33.567302 | orchestrator | ok: [testbed-node-1] 2025-09-23 18:58:33.567307 | orchestrator | ok: 
[testbed-node-4] 2025-09-23 18:58:33.567312 | orchestrator | ok: [testbed-node-0] 2025-09-23 18:58:33.567325 | orchestrator | ok: [testbed-node-5] 2025-09-23 18:58:33.567330 | orchestrator | ok: [testbed-node-3] 2025-09-23 18:58:33.567335 | orchestrator | ok: [testbed-manager] 2025-09-23 18:58:33.567340 | orchestrator | ok: [testbed-node-2] 2025-09-23 18:58:33.567345 | orchestrator | 2025-09-23 18:58:33.567351 | orchestrator | TASK [osism.services.docker : Check whether packages are installed that should not be installed] *** 2025-09-23 18:58:33.567357 | orchestrator | Tuesday 23 September 2025 18:58:26 +0000 (0:00:03.252) 0:05:18.854 ***** 2025-09-23 18:58:33.567362 | orchestrator | skipping: [testbed-manager] => (item=containerd)  2025-09-23 18:58:33.567368 | orchestrator | skipping: [testbed-manager] => (item=docker.io)  2025-09-23 18:58:33.567373 | orchestrator | skipping: [testbed-manager] => (item=docker-engine)  2025-09-23 18:58:33.567378 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:58:33.567383 | orchestrator | skipping: [testbed-node-0] => (item=containerd)  2025-09-23 18:58:33.567388 | orchestrator | skipping: [testbed-node-0] => (item=docker.io)  2025-09-23 18:58:33.567393 | orchestrator | skipping: [testbed-node-0] => (item=docker-engine)  2025-09-23 18:58:33.567398 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:58:33.567403 | orchestrator | skipping: [testbed-node-1] => (item=containerd)  2025-09-23 18:58:33.567408 | orchestrator | skipping: [testbed-node-1] => (item=docker.io)  2025-09-23 18:58:33.567413 | orchestrator | skipping: [testbed-node-1] => (item=docker-engine)  2025-09-23 18:58:33.567418 | orchestrator | skipping: [testbed-node-1] 2025-09-23 18:58:33.567423 | orchestrator | skipping: [testbed-node-2] => (item=containerd)  2025-09-23 18:58:33.567428 | orchestrator | skipping: [testbed-node-2] => (item=docker.io)  2025-09-23 18:58:33.567433 | orchestrator | skipping: [testbed-node-2] => (item=docker-engine)  
2025-09-23 18:58:33.567438 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:58:33.567443 | orchestrator | skipping: [testbed-node-3] => (item=containerd)  2025-09-23 18:58:33.567448 | orchestrator | skipping: [testbed-node-3] => (item=docker.io)  2025-09-23 18:58:33.567453 | orchestrator | skipping: [testbed-node-3] => (item=docker-engine)  2025-09-23 18:58:33.567458 | orchestrator | skipping: [testbed-node-4] => (item=containerd)  2025-09-23 18:58:33.567463 | orchestrator | skipping: [testbed-node-4] => (item=docker.io)  2025-09-23 18:58:33.567468 | orchestrator | skipping: [testbed-node-4] => (item=docker-engine)  2025-09-23 18:58:33.567473 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:58:33.567478 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:58:33.567483 | orchestrator | skipping: [testbed-node-5] => (item=containerd)  2025-09-23 18:58:33.567488 | orchestrator | skipping: [testbed-node-5] => (item=docker.io)  2025-09-23 18:58:33.567493 | orchestrator | skipping: [testbed-node-5] => (item=docker-engine)  2025-09-23 18:58:33.567498 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:58:33.567503 | orchestrator | 2025-09-23 18:58:33.567508 | orchestrator | TASK [osism.services.docker : Install apt-transport-https package] ************* 2025-09-23 18:58:33.567513 | orchestrator | Tuesday 23 September 2025 18:58:27 +0000 (0:00:00.647) 0:05:19.501 ***** 2025-09-23 18:58:33.567518 | orchestrator | ok: [testbed-manager] 2025-09-23 18:58:33.567523 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:58:33.567528 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:58:33.567533 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:58:33.567538 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:58:33.567543 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:58:33.567562 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:58:33.567567 | orchestrator | 2025-09-23 18:58:33.567579 | orchestrator | TASK 
[osism.services.docker : Add repository gpg key] ************************** 2025-09-23 18:59:27.375392 | orchestrator | Tuesday 23 September 2025 18:58:33 +0000 (0:00:06.084) 0:05:25.585 ***** 2025-09-23 18:59:27.375530 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.375547 | orchestrator | ok: [testbed-manager] 2025-09-23 18:59:27.375560 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.375571 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:59:27.375607 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.375618 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.375629 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.375640 | orchestrator | 2025-09-23 18:59:27.375652 | orchestrator | TASK [osism.services.docker : Add repository] ********************************** 2025-09-23 18:59:27.375663 | orchestrator | Tuesday 23 September 2025 18:58:34 +0000 (0:00:01.431) 0:05:27.017 ***** 2025-09-23 18:59:27.375673 | orchestrator | ok: [testbed-manager] 2025-09-23 18:59:27.375684 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.375695 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.375706 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.375717 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:59:27.375728 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.375739 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.375749 | orchestrator | 2025-09-23 18:59:27.375761 | orchestrator | TASK [osism.services.docker : Update package cache] **************************** 2025-09-23 18:59:27.375771 | orchestrator | Tuesday 23 September 2025 18:58:42 +0000 (0:00:07.781) 0:05:34.799 ***** 2025-09-23 18:59:27.375782 | orchestrator | changed: [testbed-manager] 2025-09-23 18:59:27.375793 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.375804 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.375814 | orchestrator | 
changed: [testbed-node-3] 2025-09-23 18:59:27.375825 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.375835 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.375846 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.375857 | orchestrator | 2025-09-23 18:59:27.375867 | orchestrator | TASK [osism.services.docker : Pin docker package version] ********************** 2025-09-23 18:59:27.375878 | orchestrator | Tuesday 23 September 2025 18:58:45 +0000 (0:00:03.178) 0:05:37.977 ***** 2025-09-23 18:59:27.375888 | orchestrator | ok: [testbed-manager] 2025-09-23 18:59:27.375899 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.375910 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.375921 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.375933 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:59:27.375945 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.375957 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.375969 | orchestrator | 2025-09-23 18:59:27.375981 | orchestrator | TASK [osism.services.docker : Pin docker-cli package version] ****************** 2025-09-23 18:59:27.375993 | orchestrator | Tuesday 23 September 2025 18:58:47 +0000 (0:00:01.374) 0:05:39.351 ***** 2025-09-23 18:59:27.376005 | orchestrator | ok: [testbed-manager] 2025-09-23 18:59:27.376017 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.376028 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.376040 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.376052 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:59:27.376064 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.376076 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.376087 | orchestrator | 2025-09-23 18:59:27.376100 | orchestrator | TASK [osism.services.docker : Unlock containerd package] *********************** 2025-09-23 18:59:27.376112 | orchestrator | 
Tuesday 23 September 2025 18:58:48 +0000 (0:00:01.351) 0:05:40.702 ***** 2025-09-23 18:59:27.376124 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:59:27.376135 | orchestrator | skipping: [testbed-node-1] 2025-09-23 18:59:27.376147 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:59:27.376159 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:59:27.376171 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:59:27.376183 | orchestrator | skipping: [testbed-node-5] 2025-09-23 18:59:27.376195 | orchestrator | changed: [testbed-manager] 2025-09-23 18:59:27.376207 | orchestrator | 2025-09-23 18:59:27.376219 | orchestrator | TASK [osism.services.docker : Install containerd package] ********************** 2025-09-23 18:59:27.376231 | orchestrator | Tuesday 23 September 2025 18:58:49 +0000 (0:00:00.812) 0:05:41.515 ***** 2025-09-23 18:59:27.376243 | orchestrator | ok: [testbed-manager] 2025-09-23 18:59:27.376262 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.376274 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.376287 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.376299 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.376311 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:59:27.376321 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.376332 | orchestrator | 2025-09-23 18:59:27.376343 | orchestrator | TASK [osism.services.docker : Lock containerd package] ************************* 2025-09-23 18:59:27.376354 | orchestrator | Tuesday 23 September 2025 18:58:59 +0000 (0:00:09.786) 0:05:51.302 ***** 2025-09-23 18:59:27.376364 | orchestrator | changed: [testbed-manager] 2025-09-23 18:59:27.376375 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.376386 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.376396 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.376407 | orchestrator | changed: [testbed-node-3] 2025-09-23 
18:59:27.376418 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.376428 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.376439 | orchestrator | 2025-09-23 18:59:27.376450 | orchestrator | TASK [osism.services.docker : Install docker-cli package] ********************** 2025-09-23 18:59:27.376460 | orchestrator | Tuesday 23 September 2025 18:59:00 +0000 (0:00:00.956) 0:05:52.259 ***** 2025-09-23 18:59:27.376471 | orchestrator | ok: [testbed-manager] 2025-09-23 18:59:27.376482 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.376492 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.376524 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.376535 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:59:27.376546 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.376556 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.376567 | orchestrator | 2025-09-23 18:59:27.376578 | orchestrator | TASK [osism.services.docker : Install docker package] ************************** 2025-09-23 18:59:27.376589 | orchestrator | Tuesday 23 September 2025 18:59:09 +0000 (0:00:09.182) 0:06:01.442 ***** 2025-09-23 18:59:27.376600 | orchestrator | ok: [testbed-manager] 2025-09-23 18:59:27.376610 | orchestrator | changed: [testbed-node-1] 2025-09-23 18:59:27.376621 | orchestrator | changed: [testbed-node-0] 2025-09-23 18:59:27.376632 | orchestrator | changed: [testbed-node-2] 2025-09-23 18:59:27.376656 | orchestrator | changed: [testbed-node-3] 2025-09-23 18:59:27.376668 | orchestrator | changed: [testbed-node-5] 2025-09-23 18:59:27.376698 | orchestrator | changed: [testbed-node-4] 2025-09-23 18:59:27.376709 | orchestrator | 2025-09-23 18:59:27.376720 | orchestrator | TASK [osism.services.docker : Unblock installation of python docker packages] *** 2025-09-23 18:59:27.376731 | orchestrator | Tuesday 23 September 2025 18:59:20 +0000 (0:00:10.909) 0:06:12.351 ***** 2025-09-23 18:59:27.376743 | 
orchestrator | ok: [testbed-manager] => (item=python3-docker) 2025-09-23 18:59:27.376754 | orchestrator | ok: [testbed-node-0] => (item=python3-docker) 2025-09-23 18:59:27.376765 | orchestrator | ok: [testbed-node-1] => (item=python3-docker) 2025-09-23 18:59:27.376776 | orchestrator | ok: [testbed-manager] => (item=python-docker) 2025-09-23 18:59:27.376787 | orchestrator | ok: [testbed-node-2] => (item=python3-docker) 2025-09-23 18:59:27.376798 | orchestrator | ok: [testbed-node-3] => (item=python3-docker) 2025-09-23 18:59:27.376884 | orchestrator | ok: [testbed-node-4] => (item=python3-docker) 2025-09-23 18:59:27.376899 | orchestrator | ok: [testbed-node-5] => (item=python3-docker) 2025-09-23 18:59:27.376910 | orchestrator | ok: [testbed-node-0] => (item=python-docker) 2025-09-23 18:59:27.376920 | orchestrator | ok: [testbed-node-1] => (item=python-docker) 2025-09-23 18:59:27.376931 | orchestrator | ok: [testbed-node-2] => (item=python-docker) 2025-09-23 18:59:27.376942 | orchestrator | ok: [testbed-node-3] => (item=python-docker) 2025-09-23 18:59:27.376953 | orchestrator | ok: [testbed-node-4] => (item=python-docker) 2025-09-23 18:59:27.376964 | orchestrator | ok: [testbed-node-5] => (item=python-docker) 2025-09-23 18:59:27.376974 | orchestrator | 2025-09-23 18:59:27.376985 | orchestrator | TASK [osism.services.docker : Install python3 docker package] ****************** 2025-09-23 18:59:27.377005 | orchestrator | Tuesday 23 September 2025 18:59:21 +0000 (0:00:01.196) 0:06:13.548 ***** 2025-09-23 18:59:27.377016 | orchestrator | skipping: [testbed-manager] 2025-09-23 18:59:27.377027 | orchestrator | skipping: [testbed-node-0] 2025-09-23 18:59:27.377038 | orchestrator | skipping: [testbed-node-1] 2025-09-23 18:59:27.377049 | orchestrator | skipping: [testbed-node-2] 2025-09-23 18:59:27.377060 | orchestrator | skipping: [testbed-node-3] 2025-09-23 18:59:27.377071 | orchestrator | skipping: [testbed-node-4] 2025-09-23 18:59:27.377082 | orchestrator | skipping: 
[testbed-node-5]
2025-09-23 18:59:27.377092 | orchestrator |
2025-09-23 18:59:27.377103 | orchestrator | TASK [osism.services.docker : Install python3 docker package from Debian Sid] ***
2025-09-23 18:59:27.377114 | orchestrator | Tuesday 23 September 2025 18:59:22 +0000 (0:00:00.493) 0:06:14.042 *****
2025-09-23 18:59:27.377125 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:27.377136 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:59:27.377146 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:59:27.377157 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:59:27.377168 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:59:27.377179 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:59:27.377190 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:59:27.377200 | orchestrator |
2025-09-23 18:59:27.377211 | orchestrator | TASK [osism.services.docker : Remove python docker packages (install python bindings from pip)] ***
2025-09-23 18:59:27.377224 | orchestrator | Tuesday 23 September 2025 18:59:25 +0000 (0:00:03.662) 0:06:17.704 *****
2025-09-23 18:59:27.377235 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:59:27.377246 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:59:27.377256 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:59:27.377267 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:59:27.377277 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:59:27.377288 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:59:27.377299 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:59:27.377310 | orchestrator |
2025-09-23 18:59:27.377322 | orchestrator | TASK [osism.services.docker : Block installation of python docker packages (install python bindings from pip)] ***
2025-09-23 18:59:27.377333 | orchestrator | Tuesday 23 September 2025 18:59:26 +0000 (0:00:00.492) 0:06:18.197 *****
2025-09-23 18:59:27.377344 | orchestrator | skipping: [testbed-manager] => (item=python3-docker)
2025-09-23 18:59:27.377355 | orchestrator | skipping: [testbed-manager] => (item=python-docker)
2025-09-23 18:59:27.377366 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:59:27.377377 | orchestrator | skipping: [testbed-node-0] => (item=python3-docker)
2025-09-23 18:59:27.377387 | orchestrator | skipping: [testbed-node-0] => (item=python-docker)
2025-09-23 18:59:27.377398 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:59:27.377409 | orchestrator | skipping: [testbed-node-1] => (item=python3-docker)
2025-09-23 18:59:27.377420 | orchestrator | skipping: [testbed-node-1] => (item=python-docker)
2025-09-23 18:59:27.377431 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:59:27.377441 | orchestrator | skipping: [testbed-node-2] => (item=python3-docker)
2025-09-23 18:59:27.377452 | orchestrator | skipping: [testbed-node-2] => (item=python-docker)
2025-09-23 18:59:27.377463 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:59:27.377473 | orchestrator | skipping: [testbed-node-3] => (item=python3-docker)
2025-09-23 18:59:27.377484 | orchestrator | skipping: [testbed-node-3] => (item=python-docker)
2025-09-23 18:59:27.377495 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:59:27.377524 | orchestrator | skipping: [testbed-node-4] => (item=python3-docker)
2025-09-23 18:59:27.377535 | orchestrator | skipping: [testbed-node-4] => (item=python-docker)
2025-09-23 18:59:27.377546 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:59:27.377556 | orchestrator | skipping: [testbed-node-5] => (item=python3-docker)
2025-09-23 18:59:27.377567 | orchestrator | skipping: [testbed-node-5] => (item=python-docker)
2025-09-23 18:59:27.377577 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:59:27.377595 | orchestrator |
2025-09-23 18:59:27.377606 | orchestrator | TASK [osism.services.docker : Install python3-pip package (install python bindings from pip)] ***
2025-09-23 18:59:27.377616 | orchestrator | Tuesday 23 September 2025 18:59:26 +0000 (0:00:00.713) 0:06:18.910 *****
2025-09-23 18:59:27.377627 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:59:27.377638 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:59:27.377649 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:59:27.377659 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:59:27.377670 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:59:27.377681 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:59:27.377692 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:59:27.377703 | orchestrator |
2025-09-23 18:59:27.377721 | orchestrator | TASK [osism.services.docker : Install docker packages (install python bindings from pip)] ***
2025-09-23 18:59:48.549299 | orchestrator | Tuesday 23 September 2025 18:59:27 +0000 (0:00:00.491) 0:06:19.401 *****
2025-09-23 18:59:48.549574 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:59:48.549598 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:59:48.549611 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:59:48.549623 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:59:48.549634 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:59:48.549645 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:59:48.549657 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:59:48.549668 | orchestrator |
2025-09-23 18:59:48.549681 | orchestrator | TASK [osism.services.docker : Install packages required by docker login] *******
2025-09-23 18:59:48.549692 | orchestrator | Tuesday 23 September 2025 18:59:27 +0000 (0:00:00.483) 0:06:19.884 *****
2025-09-23 18:59:48.549703 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:59:48.549715 | orchestrator | skipping: [testbed-node-0]
2025-09-23 18:59:48.549725 | orchestrator | skipping: [testbed-node-1]
2025-09-23 18:59:48.549736 | orchestrator | skipping: [testbed-node-2]
2025-09-23 18:59:48.549747 | orchestrator | skipping: [testbed-node-3]
2025-09-23 18:59:48.549758 | orchestrator | skipping: [testbed-node-4]
2025-09-23 18:59:48.549769 | orchestrator | skipping: [testbed-node-5]
2025-09-23 18:59:48.549780 | orchestrator |
2025-09-23 18:59:48.549791 | orchestrator | TASK [osism.services.docker : Ensure that some packages are not installed] *****
2025-09-23 18:59:48.549802 | orchestrator | Tuesday 23 September 2025 18:59:28 +0000 (0:00:00.516) 0:06:20.401 *****
2025-09-23 18:59:48.549815 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.549829 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:59:48.549842 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:59:48.549854 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:59:48.549866 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:59:48.549878 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:59:48.549890 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:59:48.549902 | orchestrator |
2025-09-23 18:59:48.549915 | orchestrator | TASK [osism.services.docker : Include config tasks] ****************************
2025-09-23 18:59:48.549927 | orchestrator | Tuesday 23 September 2025 18:59:30 +0000 (0:00:01.637) 0:06:22.038 *****
2025-09-23 18:59:48.549941 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:59:48.549957 | orchestrator |
2025-09-23 18:59:48.549970 | orchestrator | TASK [osism.services.docker : Create plugins directory] ************************
2025-09-23 18:59:48.549982 | orchestrator | Tuesday 23 September 2025 18:59:30 +0000 (0:00:00.984) 0:06:23.023 *****
2025-09-23 18:59:48.549994 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.550006 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:59:48.550092 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:59:48.550105 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:59:48.550118 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:59:48.550130 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:59:48.550142 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:59:48.550155 | orchestrator |
2025-09-23 18:59:48.550233 | orchestrator | TASK [osism.services.docker : Create systemd overlay directory] ****************
2025-09-23 18:59:48.550246 | orchestrator | Tuesday 23 September 2025 18:59:31 +0000 (0:00:00.824) 0:06:23.847 *****
2025-09-23 18:59:48.550257 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.550267 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:59:48.550278 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:59:48.550289 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:59:48.550299 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:59:48.550310 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:59:48.550320 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:59:48.550331 | orchestrator |
2025-09-23 18:59:48.550342 | orchestrator | TASK [osism.services.docker : Copy systemd overlay file] ***********************
2025-09-23 18:59:48.550353 | orchestrator | Tuesday 23 September 2025 18:59:32 +0000 (0:00:00.832) 0:06:24.679 *****
2025-09-23 18:59:48.550364 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.550374 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:59:48.550385 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:59:48.550396 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:59:48.550406 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:59:48.550417 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:59:48.550427 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:59:48.550438 | orchestrator |
2025-09-23 18:59:48.550448 | orchestrator | TASK [osism.services.docker : Reload systemd daemon if systemd overlay file is changed] ***
2025-09-23 18:59:48.550461 | orchestrator | Tuesday 23 September 2025 18:59:34 +0000 (0:00:01.473) 0:06:26.153 *****
2025-09-23 18:59:48.550472 | orchestrator | skipping: [testbed-manager]
2025-09-23 18:59:48.550502 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:59:48.550514 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:59:48.550525 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:59:48.550535 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:59:48.550546 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:59:48.550557 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:59:48.550568 | orchestrator |
2025-09-23 18:59:48.550579 | orchestrator | TASK [osism.services.docker : Copy limits configuration file] ******************
2025-09-23 18:59:48.550590 | orchestrator | Tuesday 23 September 2025 18:59:35 +0000 (0:00:01.386) 0:06:27.539 *****
2025-09-23 18:59:48.550601 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.550612 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:59:48.550623 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:59:48.550633 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:59:48.550644 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:59:48.550655 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:59:48.550666 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:59:48.550676 | orchestrator |
2025-09-23 18:59:48.550688 | orchestrator | TASK [osism.services.docker : Copy daemon.json configuration file] *************
2025-09-23 18:59:48.550698 | orchestrator | Tuesday 23 September 2025 18:59:36 +0000 (0:00:01.349) 0:06:28.888 *****
2025-09-23 18:59:48.550709 | orchestrator | changed: [testbed-manager]
2025-09-23 18:59:48.550720 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:59:48.550730 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:59:48.550741 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:59:48.550752 | orchestrator | changed: [testbed-node-4]
2025-09-23 18:59:48.550763 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:59:48.550774 | orchestrator | changed: [testbed-node-5]
2025-09-23 18:59:48.550785 | orchestrator |
2025-09-23 18:59:48.550819 | orchestrator | TASK [osism.services.docker : Include service tasks] ***************************
2025-09-23 18:59:48.550831 | orchestrator | Tuesday 23 September 2025 18:59:38 +0000 (0:00:01.372) 0:06:30.261 *****
2025-09-23 18:59:48.550842 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/service.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:59:48.550854 | orchestrator |
2025-09-23 18:59:48.550865 | orchestrator | TASK [osism.services.docker : Reload systemd daemon] ***************************
2025-09-23 18:59:48.550885 | orchestrator | Tuesday 23 September 2025 18:59:39 +0000 (0:00:00.995) 0:06:31.256 *****
2025-09-23 18:59:48.550896 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.550907 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:59:48.550918 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:59:48.550929 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:59:48.550940 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:59:48.550951 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:59:48.550961 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:59:48.550972 | orchestrator |
2025-09-23 18:59:48.550983 | orchestrator | TASK [osism.services.docker : Manage service] **********************************
2025-09-23 18:59:48.550994 | orchestrator | Tuesday 23 September 2025 18:59:40 +0000 (0:00:01.432) 0:06:32.689 *****
2025-09-23 18:59:48.551004 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.551015 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:59:48.551025 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:59:48.551036 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:59:48.551047 | orchestrator |
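The "Copy daemon.json configuration file" task above writes Docker's daemon configuration to every host. A minimal sketch of generating and validating such a file; the keys and values shown are illustrative assumptions, not the actual content the osism.services.docker role deploys:

```python
import json
import os
import tempfile

# Illustrative daemon.json content -- the real values come from role
# defaults and inventory variables, which this log does not show.
daemon_config = {
    "log-driver": "json-file",  # assumed logging setup
    "log-opts": {"max-size": "10m", "max-file": "3"},
}

path = os.path.join(tempfile.mkdtemp(), "daemon.json")
with open(path, "w") as f:
    json.dump(daemon_config, f, indent=2, sort_keys=True)

# dockerd rejects invalid JSON at startup; re-parsing catches syntax
# errors before the daemon is ever restarted.
with open(path) as f:
    parsed = json.load(f)
print(parsed["log-driver"])
```

Emitting the file through `json.dump` rather than a hand-written template guarantees it parses, which matters because a malformed daemon.json prevents the Docker daemon from starting at all.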
ok: [testbed-node-3]
2025-09-23 18:59:48.551057 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:59:48.551068 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:59:48.551078 | orchestrator |
2025-09-23 18:59:48.551089 | orchestrator | TASK [osism.services.docker : Manage docker socket service] ********************
2025-09-23 18:59:48.551100 | orchestrator | Tuesday 23 September 2025 18:59:41 +0000 (0:00:01.084) 0:06:33.774 *****
2025-09-23 18:59:48.551111 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.551122 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:59:48.551132 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:59:48.551143 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:59:48.551154 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:59:48.551165 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:59:48.551175 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:59:48.551186 | orchestrator |
2025-09-23 18:59:48.551197 | orchestrator | TASK [osism.services.docker : Manage containerd service] ***********************
2025-09-23 18:59:48.551208 | orchestrator | Tuesday 23 September 2025 18:59:42 +0000 (0:00:01.174) 0:06:34.948 *****
2025-09-23 18:59:48.551218 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:59:48.551229 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:59:48.551239 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:59:48.551250 | orchestrator | ok: [testbed-node-3]
2025-09-23 18:59:48.551261 | orchestrator | ok: [testbed-node-4]
2025-09-23 18:59:48.551271 | orchestrator | ok: [testbed-node-5]
2025-09-23 18:59:48.551282 | orchestrator | ok: [testbed-manager]
2025-09-23 18:59:48.551293 | orchestrator |
2025-09-23 18:59:48.551304 | orchestrator | TASK [osism.services.docker : Include bootstrap tasks] *************************
2025-09-23 18:59:48.551315 | orchestrator | Tuesday 23 September 2025 18:59:44 +0000 (0:00:01.655) 0:06:36.604 *****
2025-09-23 18:59:48.551326 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/bootstrap.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 18:59:48.551337 | orchestrator |
2025-09-23 18:59:48.551348 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2025-09-23 18:59:48.551358 | orchestrator | Tuesday 23 September 2025 18:59:45 +0000 (0:00:01.040) 0:06:37.644 *****
2025-09-23 18:59:48.551369 | orchestrator |
2025-09-23 18:59:48.551379 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2025-09-23 18:59:48.551390 | orchestrator | Tuesday 23 September 2025 18:59:45 +0000 (0:00:00.040) 0:06:37.685 *****
2025-09-23 18:59:48.551401 | orchestrator |
2025-09-23 18:59:48.551411 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2025-09-23 18:59:48.551422 | orchestrator | Tuesday 23 September 2025 18:59:45 +0000 (0:00:00.047) 0:06:37.732 *****
2025-09-23 18:59:48.551432 | orchestrator |
2025-09-23 18:59:48.551443 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2025-09-23 18:59:48.551454 | orchestrator | Tuesday 23 September 2025 18:59:45 +0000 (0:00:00.038) 0:06:37.770 *****
2025-09-23 18:59:48.551509 | orchestrator |
2025-09-23 18:59:48.551521 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2025-09-23 18:59:48.551532 | orchestrator | Tuesday 23 September 2025 18:59:45 +0000 (0:00:00.038) 0:06:37.809 *****
2025-09-23 18:59:48.551543 | orchestrator |
2025-09-23 18:59:48.551553 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2025-09-23 18:59:48.551564 | orchestrator | Tuesday 23 September 2025 18:59:45 +0000 (0:00:00.046) 0:06:37.856 *****
2025-09-23 18:59:48.551575 | orchestrator |
2025-09-23 18:59:48.551585 | orchestrator | TASK [osism.services.docker : Flush handlers] **********************************
2025-09-23 18:59:48.551596 | orchestrator | Tuesday 23 September 2025 18:59:45 +0000 (0:00:00.038) 0:06:37.895 *****
2025-09-23 18:59:48.551607 | orchestrator |
2025-09-23 18:59:48.551617 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] *****
2025-09-23 18:59:48.551628 | orchestrator | Tuesday 23 September 2025 18:59:45 +0000 (0:00:00.039) 0:06:37.934 *****
2025-09-23 18:59:48.551639 | orchestrator | ok: [testbed-node-0]
2025-09-23 18:59:48.551649 | orchestrator | ok: [testbed-node-1]
2025-09-23 18:59:48.551660 | orchestrator | ok: [testbed-node-2]
2025-09-23 18:59:48.551671 | orchestrator |
2025-09-23 18:59:48.551682 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart rsyslog service] *************
2025-09-23 18:59:48.551692 | orchestrator | Tuesday 23 September 2025 18:59:47 +0000 (0:00:01.159) 0:06:39.093 *****
2025-09-23 18:59:48.551703 | orchestrator | changed: [testbed-manager]
2025-09-23 18:59:48.551714 | orchestrator | changed: [testbed-node-0]
2025-09-23 18:59:48.551731 | orchestrator | changed: [testbed-node-1]
2025-09-23 18:59:48.551743 | orchestrator | changed: [testbed-node-2]
2025-09-23 18:59:48.551753 | orchestrator | changed: [testbed-node-3]
2025-09-23 18:59:48.551771 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:15.699713 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:15.699831 | orchestrator |
2025-09-23 19:00:15.699848 | orchestrator | RUNNING HANDLER [osism.services.docker : Restart docker service] ***************
2025-09-23 19:00:15.699861 | orchestrator | Tuesday 23 September 2025 18:59:48 +0000 (0:00:01.484) 0:06:40.578 *****
2025-09-23 19:00:15.699872 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:00:15.699883 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:00:15.699894 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:00:15.699905 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:15.699917 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:00:15.699927 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:15.699938 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:00:15.699950 | orchestrator |
2025-09-23 19:00:15.699961 | orchestrator | RUNNING HANDLER [osism.services.docker : Wait after docker service restart] ****
2025-09-23 19:00:15.699972 | orchestrator | Tuesday 23 September 2025 18:59:51 +0000 (0:00:02.516) 0:06:43.094 *****
2025-09-23 19:00:15.699983 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:00:15.699993 | orchestrator |
2025-09-23 19:00:15.700005 | orchestrator | TASK [osism.services.docker : Add user to docker group] ************************
2025-09-23 19:00:15.700016 | orchestrator | Tuesday 23 September 2025 18:59:51 +0000 (0:00:00.113) 0:06:43.207 *****
2025-09-23 19:00:15.700027 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.700039 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:00:15.700050 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:00:15.700060 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:00:15.700071 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:00:15.700082 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:15.700092 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:15.700103 | orchestrator |
2025-09-23 19:00:15.700114 | orchestrator | TASK [osism.services.docker : Log into private registry and force re-authorization] ***
2025-09-23 19:00:15.700126 | orchestrator | Tuesday 23 September 2025 18:59:52 +0000 (0:00:00.979) 0:06:44.187 *****
2025-09-23 19:00:15.700136 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:00:15.700147 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:00:15.700181 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:00:15.700192 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:00:15.700203 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:00:15.700213 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:00:15.700224 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:00:15.700235 | orchestrator |
2025-09-23 19:00:15.700246 | orchestrator | TASK [osism.services.docker : Include facts tasks] *****************************
2025-09-23 19:00:15.700256 | orchestrator | Tuesday 23 September 2025 18:59:52 +0000 (0:00:00.530) 0:06:44.718 *****
2025-09-23 19:00:15.700268 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/facts.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:00:15.700281 | orchestrator |
2025-09-23 19:00:15.700292 | orchestrator | TASK [osism.services.docker : Create facts directory] **************************
2025-09-23 19:00:15.700303 | orchestrator | Tuesday 23 September 2025 18:59:53 +0000 (0:00:01.068) 0:06:45.786 *****
2025-09-23 19:00:15.700314 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.700324 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:15.700335 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:15.700346 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:15.700357 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:15.700368 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:15.700378 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:15.700389 | orchestrator |
2025-09-23 19:00:15.700400 | orchestrator | TASK [osism.services.docker : Copy docker fact files] **************************
2025-09-23 19:00:15.700411 | orchestrator | Tuesday 23 September 2025 18:59:54 +0000 (0:00:00.841) 0:06:46.627 *****
2025-09-23 19:00:15.700421 | orchestrator | ok: [testbed-manager] => (item=docker_containers)
2025-09-23 19:00:15.700432 | orchestrator | changed: [testbed-node-0] => (item=docker_containers)
2025-09-23 19:00:15.700443
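The facts tasks above drop files such as `docker_containers` and `docker_images` into Ansible's local facts directory so that later plays can read them back as `ansible_local`. A sketch of that mechanism, using a temporary directory in place of `/etc/ansible/facts.d` and an invented payload (the role's real fact files may be executable scripts rather than the static-JSON variant shown here):

```python
import json
import os
import tempfile

# Stand-in for /etc/ansible/facts.d; the fact name matches the log,
# the payload is made up for illustration.
facts_dir = tempfile.mkdtemp()
fact_file = os.path.join(facts_dir, "docker_containers.fact")

with open(fact_file, "w") as f:
    json.dump({"running": 0}, f)

# The setup module reads *.fact files from the facts directory and
# exposes each one as ansible_local.<name> on the host.
name = os.path.basename(fact_file).removesuffix(".fact")
with open(fact_file) as f:
    ansible_local = {name: json.load(f)}
print(ansible_local["docker_containers"]["running"])
```

Static `.fact` files containing JSON are the simplest form of custom local facts; the same directory also accepts executable fact scripts whose stdout is parsed as JSON.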
| orchestrator | changed: [testbed-node-1] => (item=docker_containers)
2025-09-23 19:00:15.700477 | orchestrator | changed: [testbed-node-2] => (item=docker_containers)
2025-09-23 19:00:15.700489 | orchestrator | changed: [testbed-node-3] => (item=docker_containers)
2025-09-23 19:00:15.700500 | orchestrator | changed: [testbed-node-4] => (item=docker_containers)
2025-09-23 19:00:15.700511 | orchestrator | changed: [testbed-node-5] => (item=docker_containers)
2025-09-23 19:00:15.700522 | orchestrator | ok: [testbed-manager] => (item=docker_images)
2025-09-23 19:00:15.700533 | orchestrator | changed: [testbed-node-0] => (item=docker_images)
2025-09-23 19:00:15.700543 | orchestrator | changed: [testbed-node-1] => (item=docker_images)
2025-09-23 19:00:15.700554 | orchestrator | changed: [testbed-node-2] => (item=docker_images)
2025-09-23 19:00:15.700565 | orchestrator | changed: [testbed-node-3] => (item=docker_images)
2025-09-23 19:00:15.700576 | orchestrator | changed: [testbed-node-4] => (item=docker_images)
2025-09-23 19:00:15.700587 | orchestrator | changed: [testbed-node-5] => (item=docker_images)
2025-09-23 19:00:15.700597 | orchestrator |
2025-09-23 19:00:15.700608 | orchestrator | TASK [osism.commons.docker_compose : This install type is not supported] *******
2025-09-23 19:00:15.700619 | orchestrator | Tuesday 23 September 2025 18:59:57 +0000 (0:00:02.500) 0:06:49.128 *****
2025-09-23 19:00:15.700630 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:00:15.700641 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:00:15.700652 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:00:15.700662 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:00:15.700673 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:00:15.700684 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:00:15.700695 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:00:15.700706 | orchestrator |
2025-09-23 19:00:15.700717 | orchestrator | TASK [osism.commons.docker_compose : Include distribution specific install tasks] ***
2025-09-23 19:00:15.700728 | orchestrator | Tuesday 23 September 2025 18:59:57 +0000 (0:00:00.484) 0:06:49.612 *****
2025-09-23 19:00:15.700775 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/docker_compose/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:00:15.700798 | orchestrator |
2025-09-23 19:00:15.700809 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose apt preferences file] ***
2025-09-23 19:00:15.700820 | orchestrator | Tuesday 23 September 2025 18:59:58 +0000 (0:00:00.954) 0:06:50.567 *****
2025-09-23 19:00:15.700831 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.700842 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:15.700853 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:15.700863 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:15.700874 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:15.700885 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:15.700896 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:15.700907 | orchestrator |
2025-09-23 19:00:15.700917 | orchestrator | TASK [osism.commons.docker_compose : Get checksum of docker-compose file] ******
2025-09-23 19:00:15.700928 | orchestrator | Tuesday 23 September 2025 18:59:59 +0000 (0:00:00.847) 0:06:51.414 *****
2025-09-23 19:00:15.700939 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.700950 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:15.700961 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:15.700971 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:15.700982 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:15.700993 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:15.701003 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:15.701014 | orchestrator |
2025-09-23 19:00:15.701025 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose binary] *************
2025-09-23 19:00:15.701036 | orchestrator | Tuesday 23 September 2025 19:00:00 +0000 (0:00:00.815) 0:06:52.229 *****
2025-09-23 19:00:15.701047 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:00:15.701058 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:00:15.701069 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:00:15.701079 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:00:15.701090 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:00:15.701101 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:00:15.701111 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:00:15.701122 | orchestrator |
2025-09-23 19:00:15.701133 | orchestrator | TASK [osism.commons.docker_compose : Uninstall docker-compose package] *********
2025-09-23 19:00:15.701144 | orchestrator | Tuesday 23 September 2025 19:00:00 +0000 (0:00:00.486) 0:06:52.716 *****
2025-09-23 19:00:15.701155 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.701165 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:15.701176 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:15.701187 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:15.701198 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:15.701208 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:15.701219 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:15.701230 | orchestrator |
2025-09-23 19:00:15.701241 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose script] ***************
2025-09-23 19:00:15.701252 | orchestrator | Tuesday 23 September 2025 19:00:02 +0000 (0:00:01.716) 0:06:54.433 *****
2025-09-23 19:00:15.701263 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:00:15.701273 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:00:15.701284 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:00:15.701295 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:00:15.701306 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:00:15.701317 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:00:15.701327 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:00:15.701338 | orchestrator |
2025-09-23 19:00:15.701349 | orchestrator | TASK [osism.commons.docker_compose : Install docker-compose-plugin package] ****
2025-09-23 19:00:15.701360 | orchestrator | Tuesday 23 September 2025 19:00:02 +0000 (0:00:00.507) 0:06:54.941 *****
2025-09-23 19:00:15.701370 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.701381 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:00:15.701392 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:00:15.701409 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:00:15.701420 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:15.701431 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:00:15.701442 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:15.701452 | orchestrator |
2025-09-23 19:00:15.701479 | orchestrator | TASK [osism.commons.docker_compose : Copy osism.target systemd file] ***********
2025-09-23 19:00:15.701490 | orchestrator | Tuesday 23 September 2025 19:00:10 +0000 (0:00:07.337) 0:07:02.278 *****
2025-09-23 19:00:15.701501 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.701512 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:00:15.701523 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:00:15.701534 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:00:15.701545 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:15.701555 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:00:15.701566 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:15.701577 | orchestrator |
2025-09-23 19:00:15.701588 | orchestrator | TASK [osism.commons.docker_compose : Enable osism.target] **********************
2025-09-23 19:00:15.701598 | orchestrator | Tuesday 23 September 2025 19:00:11 +0000 (0:00:01.304) 0:07:03.583 *****
2025-09-23 19:00:15.701609 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.701620 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:00:15.701631 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:00:15.701641 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:00:15.701652 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:00:15.701663 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:15.701673 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:15.701684 | orchestrator |
2025-09-23 19:00:15.701695 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose systemd unit file] ****
2025-09-23 19:00:15.701706 | orchestrator | Tuesday 23 September 2025 19:00:13 +0000 (0:00:01.796) 0:07:05.379 *****
2025-09-23 19:00:15.701717 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.701728 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:00:15.701739 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:00:15.701750 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:00:15.701760 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:00:15.701771 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:15.701782 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:15.701792 | orchestrator |
2025-09-23 19:00:15.701803 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] *********************
2025-09-23 19:00:15.701820 | orchestrator | Tuesday 23 September 2025 19:00:14 +0000 (0:00:01.523) 0:07:06.903 *****
2025-09-23 19:00:15.701831 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:15.701842 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:15.701852 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:15.701863 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:15.701881 | orchestrator | ok:
[testbed-node-2] 2025-09-23 19:00:46.156845 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:00:46.156950 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:00:46.156964 | orchestrator | 2025-09-23 19:00:46.156976 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-09-23 19:00:46.156986 | orchestrator | Tuesday 23 September 2025 19:00:15 +0000 (0:00:00.821) 0:07:07.724 ***** 2025-09-23 19:00:46.156996 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:00:46.157005 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:00:46.157014 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:00:46.157023 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:00:46.157032 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:00:46.157041 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:00:46.157050 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:00:46.157059 | orchestrator | 2025-09-23 19:00:46.157068 | orchestrator | TASK [osism.services.chrony : Check minimum and maximum number of servers] ***** 2025-09-23 19:00:46.157077 | orchestrator | Tuesday 23 September 2025 19:00:16 +0000 (0:00:00.814) 0:07:08.539 ***** 2025-09-23 19:00:46.157086 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:00:46.157116 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:00:46.157125 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:00:46.157134 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:00:46.157143 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:00:46.157151 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:00:46.157160 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:00:46.157169 | orchestrator | 2025-09-23 19:00:46.157178 | orchestrator | TASK [osism.services.chrony : Gather variables for each operating system] ****** 2025-09-23 19:00:46.157186 | orchestrator | Tuesday 23 September 2025 19:00:16 +0000 (0:00:00.450) 0:07:08.989 
***** 2025-09-23 19:00:46.157195 | orchestrator | ok: [testbed-manager] 2025-09-23 19:00:46.157204 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:00:46.157213 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:00:46.157221 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:00:46.157230 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:00:46.157239 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:00:46.157247 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:00:46.157256 | orchestrator | 2025-09-23 19:00:46.157265 | orchestrator | TASK [osism.services.chrony : Set chrony_conf_file variable to default value] *** 2025-09-23 19:00:46.157274 | orchestrator | Tuesday 23 September 2025 19:00:17 +0000 (0:00:00.449) 0:07:09.439 ***** 2025-09-23 19:00:46.157283 | orchestrator | ok: [testbed-manager] 2025-09-23 19:00:46.157292 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:00:46.157301 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:00:46.157309 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:00:46.157318 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:00:46.157326 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:00:46.157335 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:00:46.157343 | orchestrator | 2025-09-23 19:00:46.157352 | orchestrator | TASK [osism.services.chrony : Set chrony_key_file variable to default value] *** 2025-09-23 19:00:46.157360 | orchestrator | Tuesday 23 September 2025 19:00:17 +0000 (0:00:00.445) 0:07:09.885 ***** 2025-09-23 19:00:46.157369 | orchestrator | ok: [testbed-manager] 2025-09-23 19:00:46.157378 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:00:46.157386 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:00:46.157396 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:00:46.157406 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:00:46.157415 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:00:46.157425 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:00:46.157435 | orchestrator | 
2025-09-23 19:00:46.157467 | orchestrator | TASK [osism.services.chrony : Populate service facts] **************************
2025-09-23 19:00:46.157477 | orchestrator | Tuesday 23 September 2025 19:00:18 +0000 (0:00:00.508) 0:07:10.394 *****
2025-09-23 19:00:46.157487 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:46.157497 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:46.157507 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:46.157517 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:46.157526 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:46.157536 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:46.157545 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:46.157555 | orchestrator |
2025-09-23 19:00:46.157565 | orchestrator | TASK [osism.services.chrony : Manage timesyncd service] ************************
2025-09-23 19:00:46.157574 | orchestrator | Tuesday 23 September 2025 19:00:23 +0000 (0:00:05.443) 0:07:15.837 *****
2025-09-23 19:00:46.157583 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:00:46.157592 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:00:46.157600 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:00:46.157609 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:00:46.157618 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:00:46.157627 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:00:46.157636 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:00:46.157644 | orchestrator |
2025-09-23 19:00:46.157653 | orchestrator | TASK [osism.services.chrony : Include distribution specific install tasks] *****
2025-09-23 19:00:46.157662 | orchestrator | Tuesday 23 September 2025 19:00:24 +0000 (0:00:00.533) 0:07:16.371 *****
2025-09-23 19:00:46.157680 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:00:46.157691 | orchestrator |
2025-09-23 19:00:46.157700 | orchestrator | TASK [osism.services.chrony : Install package] *********************************
2025-09-23 19:00:46.157709 | orchestrator | Tuesday 23 September 2025 19:00:25 +0000 (0:00:00.778) 0:07:17.150 *****
2025-09-23 19:00:46.157718 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:46.157726 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:46.157735 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:46.157744 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:46.157753 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:46.157762 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:46.157770 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:46.157779 | orchestrator |
2025-09-23 19:00:46.157788 | orchestrator | TASK [osism.services.chrony : Manage chrony service] ***************************
2025-09-23 19:00:46.157796 | orchestrator | Tuesday 23 September 2025 19:00:27 +0000 (0:00:02.091) 0:07:19.241 *****
2025-09-23 19:00:46.157805 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:46.157814 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:46.157822 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:46.157831 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:46.157840 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:46.157848 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:46.157857 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:46.157866 | orchestrator |
2025-09-23 19:00:46.157898 | orchestrator | TASK [osism.services.chrony : Check if configuration file exists] **************
2025-09-23 19:00:46.157908 | orchestrator | Tuesday 23 September 2025 19:00:28 +0000 (0:00:01.699) 0:07:20.941 *****
2025-09-23 19:00:46.157917 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:46.157925 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:46.157934 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:46.157943 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:46.157952 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:46.157960 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:46.157969 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:46.157978 | orchestrator |
2025-09-23 19:00:46.157986 | orchestrator | TASK [osism.services.chrony : Copy configuration file] *************************
2025-09-23 19:00:46.157995 | orchestrator | Tuesday 23 September 2025 19:00:29 +0000 (0:00:00.874) 0:07:21.815 *****
2025-09-23 19:00:46.158057 | orchestrator | changed: [testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2025-09-23 19:00:46.158070 | orchestrator | changed: [testbed-node-0] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2025-09-23 19:00:46.158078 | orchestrator | changed: [testbed-node-1] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2025-09-23 19:00:46.158087 | orchestrator | changed: [testbed-node-2] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2025-09-23 19:00:46.158096 | orchestrator | changed: [testbed-node-3] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2025-09-23 19:00:46.158104 | orchestrator | changed: [testbed-node-4] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2025-09-23 19:00:46.158113 | orchestrator | changed: [testbed-node-5] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2)
2025-09-23 19:00:46.158122 | orchestrator |
2025-09-23 19:00:46.158131 | orchestrator | TASK [osism.services.lldpd : Include distribution specific install tasks] ******
2025-09-23 19:00:46.158139 | orchestrator | Tuesday 23 September 2025 19:00:31 +0000 (0:00:01.670) 0:07:23.486 *****
2025-09-23 19:00:46.158156 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/lldpd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:00:46.158165 | orchestrator |
2025-09-23 19:00:46.158173 | orchestrator | TASK [osism.services.lldpd : Install lldpd package] ****************************
2025-09-23 19:00:46.158182 | orchestrator | Tuesday 23 September 2025 19:00:32 +0000 (0:00:00.922) 0:07:24.408 *****
2025-09-23 19:00:46.158191 | orchestrator | changed: [testbed-manager]
2025-09-23 19:00:46.158200 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:00:46.158208 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:46.158217 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:00:46.158226 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:00:46.158234 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:00:46.158243 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:46.158252 | orchestrator |
2025-09-23 19:00:46.158260 | orchestrator | TASK [osism.services.lldpd : Manage lldpd service] *****************************
2025-09-23 19:00:46.158269 | orchestrator | Tuesday 23 September 2025 19:00:41 +0000 (0:00:08.777) 0:07:33.186 *****
2025-09-23 19:00:46.158278 | orchestrator | ok: [testbed-manager]
2025-09-23 19:00:46.158286 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:46.158295 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:46.158303 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:46.158312 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:46.158321 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:46.158329 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:46.158338 | orchestrator |
2025-09-23 19:00:46.158346 | orchestrator | RUNNING HANDLER [osism.commons.docker_compose : Reload systemd daemon] *********
2025-09-23 19:00:46.158355 | orchestrator | Tuesday 23 September 2025 19:00:43 +0000 (0:00:01.926) 0:07:35.112 *****
2025-09-23 19:00:46.158364 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:00:46.158372 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:00:46.158381 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:00:46.158389 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:00:46.158398 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:00:46.158406 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:00:46.158415 | orchestrator |
2025-09-23 19:00:46.158423 | orchestrator | RUNNING HANDLER [osism.services.chrony : Restart chrony service] ***************
2025-09-23 19:00:46.158432 | orchestrator | Tuesday 23 September 2025 19:00:44 +0000 (0:00:01.281) 0:07:36.394 *****
2025-09-23 19:00:46.158474 | orchestrator | changed: [testbed-manager]
2025-09-23 19:00:46.158484 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:00:46.158493 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:00:46.158502 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:00:46.158510 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:00:46.158519 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:00:46.158527 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:00:46.158536 | orchestrator |
2025-09-23 19:00:46.158545 | orchestrator | PLAY [Apply bootstrap role part 2] *********************************************
2025-09-23 19:00:46.158553 | orchestrator |
2025-09-23 19:00:46.158562 | orchestrator | TASK [Include hardening role] **************************************************
2025-09-23 19:00:46.158571 | orchestrator | Tuesday 23 September 2025 19:00:45 +0000 (0:00:00.528) 0:07:37.658 *****
2025-09-23 19:00:46.158579 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:00:46.158588 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:00:46.158601 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:00:46.158610 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:00:46.158618 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:00:46.158627 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:00:46.158642 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:01:12.100921 | orchestrator |
2025-09-23 19:01:12.101014 | orchestrator | PLAY [Apply bootstrap roles part 3] ********************************************
2025-09-23 19:01:12.101030 | orchestrator |
2025-09-23 19:01:12.101042 | orchestrator | TASK [osism.services.journald : Copy configuration file] ***********************
2025-09-23 19:01:12.101073 | orchestrator | Tuesday 23 September 2025 19:00:46 +0000 (0:00:00.528) 0:07:38.187 *****
2025-09-23 19:01:12.101085 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.101096 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.101106 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.101117 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.101128 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.101138 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.101149 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.101159 | orchestrator |
2025-09-23 19:01:12.101170 | orchestrator | TASK [osism.services.journald : Manage journald service] ***********************
2025-09-23 19:01:12.101181 | orchestrator | Tuesday 23 September 2025 19:00:47 +0000 (0:00:01.521) 0:07:39.708 *****
2025-09-23 19:01:12.101192 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:12.101203 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:01:12.101214 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:01:12.101225 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:01:12.101235 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:01:12.101246 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:01:12.101256 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:01:12.101267 | orchestrator |
2025-09-23 19:01:12.101277 | orchestrator | TASK [Include auditd role] *****************************************************
2025-09-23 19:01:12.101288 | orchestrator | Tuesday 23 September 2025 19:00:49 +0000 (0:00:01.418) 0:07:41.126 *****
2025-09-23 19:01:12.101299 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:01:12.101310 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:01:12.101320 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:01:12.101331 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:01:12.101341 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:01:12.101352 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:01:12.101362 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:01:12.101373 | orchestrator |
2025-09-23 19:01:12.101384 | orchestrator | TASK [Include smartd role] *****************************************************
2025-09-23 19:01:12.101395 | orchestrator | Tuesday 23 September 2025 19:00:49 +0000 (0:00:00.492) 0:07:41.619 *****
2025-09-23 19:01:12.101406 | orchestrator | included: osism.services.smartd for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:01:12.101418 | orchestrator |
2025-09-23 19:01:12.101470 | orchestrator | TASK [osism.services.smartd : Include distribution specific install tasks] *****
2025-09-23 19:01:12.101483 | orchestrator | Tuesday 23 September 2025 19:00:50 +0000 (0:00:00.951) 0:07:42.570 *****
2025-09-23 19:01:12.101497 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/smartd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:01:12.101512 | orchestrator |
2025-09-23 19:01:12.101524 | orchestrator | TASK [osism.services.smartd : Install smartmontools package] *******************
2025-09-23 19:01:12.101537 | orchestrator | Tuesday 23 September 2025 19:00:51 +0000 (0:00:00.813) 0:07:43.383 *****
2025-09-23 19:01:12.101549 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.101562 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.101574 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.101587 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.101599 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.101611 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.101624 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.101636 | orchestrator |
2025-09-23 19:01:12.101649 | orchestrator | TASK [osism.services.smartd : Create /var/log/smartd directory] ****************
2025-09-23 19:01:12.101661 | orchestrator | Tuesday 23 September 2025 19:00:59 +0000 (0:00:08.149) 0:07:51.533 *****
2025-09-23 19:01:12.101674 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.101686 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.101706 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.101719 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.101731 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.101744 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.101756 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.101768 | orchestrator |
2025-09-23 19:01:12.101780 | orchestrator | TASK [osism.services.smartd : Copy smartmontools configuration file] ***********
2025-09-23 19:01:12.101792 | orchestrator | Tuesday 23 September 2025 19:01:00 +0000 (0:00:00.837) 0:07:52.370 *****
2025-09-23 19:01:12.101805 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.101817 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.101829 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.101840 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.101850 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.101861 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.101872 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.101882 | orchestrator |
2025-09-23 19:01:12.101893 | orchestrator | TASK [osism.services.smartd : Manage smartd service] ***************************
2025-09-23 19:01:12.101904 | orchestrator | Tuesday 23 September 2025 19:01:01 +0000 (0:00:01.522) 0:07:53.893 *****
2025-09-23 19:01:12.101915 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.101926 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.101936 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.101947 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.101958 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.101968 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.101979 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.101989 | orchestrator |
2025-09-23 19:01:12.102000 | orchestrator | RUNNING HANDLER [osism.services.journald : Restart journald service] ***********
2025-09-23 19:01:12.102011 | orchestrator | Tuesday 23 September 2025 19:01:03 +0000 (0:00:01.682) 0:07:55.576 *****
2025-09-23 19:01:12.102088 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.102102 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.102113 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.102123 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.102151 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.102163 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.102173 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.102184 | orchestrator |
2025-09-23 19:01:12.102195 | orchestrator | RUNNING HANDLER [osism.services.smartd : Restart smartd service] ***************
2025-09-23 19:01:12.102206 | orchestrator | Tuesday 23 September 2025 19:01:04 +0000 (0:00:01.167) 0:07:56.743 *****
2025-09-23 19:01:12.102216 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.102227 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.102237 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.102248 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.102259 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.102270 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.102280 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.102291 | orchestrator |
2025-09-23 19:01:12.102302 | orchestrator | PLAY [Set state bootstrap] *****************************************************
2025-09-23 19:01:12.102313 | orchestrator |
2025-09-23 19:01:12.102323 | orchestrator | TASK [Set osism.bootstrap.status fact] *****************************************
2025-09-23 19:01:12.102334 | orchestrator | Tuesday 23 September 2025 19:01:06 +0000 (0:00:01.318) 0:07:58.062 *****
2025-09-23 19:01:12.102345 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:01:12.102356 | orchestrator |
2025-09-23 19:01:12.102367 | orchestrator | TASK [osism.commons.state : Create custom facts directory] *********************
2025-09-23 19:01:12.102377 | orchestrator | Tuesday 23 September 2025 19:01:06 +0000 (0:00:00.766) 0:07:58.829 *****
2025-09-23 19:01:12.102388 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:12.102399 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:01:12.102417 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:01:12.102446 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:01:12.102458 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:01:12.102469 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:01:12.102479 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:01:12.102490 | orchestrator |
2025-09-23 19:01:12.102501 | orchestrator | TASK [osism.commons.state : Write state into file] *****************************
2025-09-23 19:01:12.102511 | orchestrator | Tuesday 23 September 2025 19:01:07 +0000 (0:00:00.821) 0:07:59.651 *****
2025-09-23 19:01:12.102522 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.102533 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.102544 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.102555 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.102565 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.102576 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.102587 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.102597 | orchestrator |
2025-09-23 19:01:12.102608 | orchestrator | TASK [Set osism.bootstrap.timestamp fact] **************************************
2025-09-23 19:01:12.102619 | orchestrator | Tuesday 23 September 2025 19:01:08 +0000 (0:00:01.347) 0:08:00.998 *****
2025-09-23 19:01:12.102630 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:01:12.102641 | orchestrator |
2025-09-23 19:01:12.102652 | orchestrator | TASK [osism.commons.state : Create custom facts directory] *********************
2025-09-23 19:01:12.102663 | orchestrator | Tuesday 23 September 2025 19:01:09 +0000 (0:00:00.845) 0:08:01.844 *****
2025-09-23 19:01:12.102674 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:12.102684 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:01:12.102695 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:01:12.102706 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:01:12.102716 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:01:12.102727 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:01:12.102738 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:01:12.102748 | orchestrator |
2025-09-23 19:01:12.102759 | orchestrator | TASK [osism.commons.state : Write state into file] *****************************
2025-09-23 19:01:12.102770 | orchestrator | Tuesday 23 September 2025 19:01:10 +0000 (0:00:00.907) 0:08:02.752 *****
2025-09-23 19:01:12.102781 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:12.102792 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:12.102802 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:12.102813 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:12.102824 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:12.102834 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:12.102845 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:12.102856 | orchestrator |
2025-09-23 19:01:12.102867 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:01:12.102878 | orchestrator | testbed-manager : ok=164  changed=38  unreachable=0 failed=0 skipped=42  rescued=0 ignored=0
2025-09-23 19:01:12.102890 | orchestrator | testbed-node-0 : ok=173  changed=67  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0
2025-09-23 19:01:12.102901 | orchestrator | testbed-node-1 : ok=173  changed=67  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0
2025-09-23 19:01:12.102912 | orchestrator | testbed-node-2 : ok=173  changed=67  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0
2025-09-23 19:01:12.102922 | orchestrator | testbed-node-3 : ok=171  changed=63  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0
2025-09-23 19:01:12.102933 | orchestrator | testbed-node-4 : ok=171  changed=63  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0
2025-09-23 19:01:12.102955 | orchestrator | testbed-node-5 : ok=171  changed=63  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0
2025-09-23 19:01:12.102966 | orchestrator |
2025-09-23 19:01:12.102977 | orchestrator |
2025-09-23 19:01:12.102995 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:01:12.355769 | orchestrator | Tuesday 23 September 2025 19:01:12 +0000 (0:00:01.366) 0:08:04.118 *****
2025-09-23 19:01:12.355850 | orchestrator | ===============================================================================
2025-09-23 19:01:12.355863 | orchestrator | osism.commons.packages : Install required packages --------------------- 75.61s
2025-09-23 19:01:12.355873 | orchestrator | osism.commons.packages : Download required packages -------------------- 39.49s
2025-09-23 19:01:12.355883 | orchestrator | osism.commons.cleanup : Cleanup installed packages --------------------- 35.01s
2025-09-23 19:01:12.355893 | orchestrator | osism.commons.repository : Update package cache ------------------------ 18.45s
2025-09-23 19:01:12.355903 | orchestrator | osism.commons.systohc : Install util-linux-extra package --------------- 12.02s
2025-09-23 19:01:12.355912 | orchestrator | osism.commons.packages : Remove dependencies that are no longer required -- 11.89s
2025-09-23 19:01:12.355922 | orchestrator | osism.services.docker : Install docker package ------------------------- 10.91s
2025-09-23 19:01:12.355932 | orchestrator | osism.services.docker : Install containerd package ---------------------- 9.79s
2025-09-23 19:01:12.355942 | orchestrator | osism.services.docker : Install docker-cli package ---------------------- 9.18s
2025-09-23 19:01:12.355951 | orchestrator | osism.services.lldpd : Install lldpd package ---------------------------- 8.78s
2025-09-23 19:01:12.355961 | orchestrator | osism.commons.cleanup : Remove cloudinit package ------------------------ 8.33s
2025-09-23 19:01:12.355970 | orchestrator | osism.services.smartd : Install smartmontools package ------------------- 8.15s
2025-09-23 19:01:12.355980 | orchestrator | osism.services.rng : Install rng package -------------------------------- 8.04s
2025-09-23 19:01:12.355990 | orchestrator | osism.services.docker : Add repository ---------------------------------- 7.78s
2025-09-23 19:01:12.355999 | orchestrator | osism.commons.docker_compose : Install docker-compose-plugin package ---- 7.34s
2025-09-23 19:01:12.356009 | orchestrator | osism.commons.cleanup : Uninstall unattended-upgrades package ----------- 6.95s
2025-09-23 19:01:12.356018 | orchestrator | osism.services.docker : Install apt-transport-https package ------------- 6.08s
2025-09-23 19:01:12.356028 | orchestrator | osism.commons.cleanup : Remove dependencies that are no longer required --- 6.06s
2025-09-23 19:01:12.356038 | orchestrator | osism.commons.services : Populate service facts ------------------------- 5.50s
2025-09-23 19:01:12.356047 | orchestrator | osism.commons.cleanup : Populate service facts -------------------------- 5.49s
2025-09-23 19:01:12.541198 | orchestrator | + [[ -e /etc/redhat-release ]]
2025-09-23 19:01:12.541296 | orchestrator | + osism apply network
2025-09-23 19:01:24.785101 | orchestrator | 2025-09-23 19:01:24 | INFO  | Task a5ed14ed-58b0-44b5-8e98-88abc57fd844 (network) was prepared for execution.
2025-09-23 19:01:24.785212 | orchestrator | 2025-09-23 19:01:24 | INFO  | It takes a moment until task a5ed14ed-58b0-44b5-8e98-88abc57fd844 (network) has been started and output is visible here.
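The per-task duration summary above (`TASKS RECAP` with sorted timings like `75.61s`) is the output format produced by Ansible's `profile_tasks` callback plugin. Assuming a stock setup (the job's actual `ansible.cfg` is not shown in this log), it is enabled like this:

```ini
[defaults]
# profile_tasks prints a timestamp and elapsed time per task, plus the
# sorted duration recap seen in the log above
callbacks_enabled = ansible.posix.profile_tasks
```

Older Ansible releases used the `callback_whitelist` key for the same setting.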
2025-09-23 19:01:54.185550 | orchestrator |
2025-09-23 19:01:54.185661 | orchestrator | PLAY [Apply role network] ******************************************************
2025-09-23 19:01:54.185676 | orchestrator |
2025-09-23 19:01:54.185686 | orchestrator | TASK [osism.commons.network : Gather variables for each operating system] ******
2025-09-23 19:01:54.185697 | orchestrator | Tuesday 23 September 2025 19:01:29 +0000 (0:00:00.296) 0:00:00.296 *****
2025-09-23 19:01:54.185707 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:54.185718 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:01:54.185729 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:01:54.185738 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:01:54.185748 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:01:54.185758 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:01:54.185767 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:01:54.185801 | orchestrator |
2025-09-23 19:01:54.185812 | orchestrator | TASK [osism.commons.network : Include type specific tasks] *********************
2025-09-23 19:01:54.185822 | orchestrator | Tuesday 23 September 2025 19:01:29 +0000 (0:00:00.679) 0:00:00.976 *****
2025-09-23 19:01:54.185832 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/netplan-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:01:54.185845 | orchestrator |
2025-09-23 19:01:54.185855 | orchestrator | TASK [osism.commons.network : Install required packages] ***********************
2025-09-23 19:01:54.185864 | orchestrator | Tuesday 23 September 2025 19:01:30 +0000 (0:00:01.189) 0:00:02.165 *****
2025-09-23 19:01:54.185874 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:54.185883 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:01:54.185893 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:01:54.185902 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:01:54.185912 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:01:54.185921 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:01:54.185931 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:01:54.185940 | orchestrator |
2025-09-23 19:01:54.185950 | orchestrator | TASK [osism.commons.network : Remove ifupdown package] *************************
2025-09-23 19:01:54.185959 | orchestrator | Tuesday 23 September 2025 19:01:33 +0000 (0:00:02.058) 0:00:04.223 *****
2025-09-23 19:01:54.185969 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:54.185979 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:01:54.185988 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:01:54.185997 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:01:54.186007 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:01:54.186071 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:01:54.186082 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:01:54.186093 | orchestrator |
2025-09-23 19:01:54.186118 | orchestrator | TASK [osism.commons.network : Create required directories] *********************
2025-09-23 19:01:54.186130 | orchestrator | Tuesday 23 September 2025 19:01:34 +0000 (0:00:01.792) 0:00:06.016 *****
2025-09-23 19:01:54.186141 | orchestrator | ok: [testbed-manager] => (item=/etc/netplan)
2025-09-23 19:01:54.186152 | orchestrator | ok: [testbed-node-0] => (item=/etc/netplan)
2025-09-23 19:01:54.186163 | orchestrator | ok: [testbed-node-1] => (item=/etc/netplan)
2025-09-23 19:01:54.186174 | orchestrator | ok: [testbed-node-2] => (item=/etc/netplan)
2025-09-23 19:01:54.186185 | orchestrator | ok: [testbed-node-3] => (item=/etc/netplan)
2025-09-23 19:01:54.186196 | orchestrator | ok: [testbed-node-4] => (item=/etc/netplan)
2025-09-23 19:01:54.186208 | orchestrator | ok: [testbed-node-5] => (item=/etc/netplan)
2025-09-23 19:01:54.186220 | orchestrator |
2025-09-23 19:01:54.186231 | orchestrator | TASK [osism.commons.network : Prepare netplan configuration template] **********
2025-09-23 19:01:54.186242 | orchestrator | Tuesday 23 September 2025 19:01:35 +0000 (0:00:00.964) 0:00:06.980 *****
2025-09-23 19:01:54.186254 | orchestrator | ok: [testbed-node-0 -> localhost]
2025-09-23 19:01:54.186265 | orchestrator | ok: [testbed-node-3 -> localhost]
2025-09-23 19:01:54.186276 | orchestrator | ok: [testbed-node-5 -> localhost]
2025-09-23 19:01:54.186287 | orchestrator | ok: [testbed-manager -> localhost]
2025-09-23 19:01:54.186298 | orchestrator | ok: [testbed-node-4 -> localhost]
2025-09-23 19:01:54.186309 | orchestrator | ok: [testbed-node-2 -> localhost]
2025-09-23 19:01:54.186320 | orchestrator | ok: [testbed-node-1 -> localhost]
2025-09-23 19:01:54.186331 | orchestrator |
2025-09-23 19:01:54.186341 | orchestrator | TASK [osism.commons.network : Copy netplan configuration] **********************
2025-09-23 19:01:54.186352 | orchestrator | Tuesday 23 September 2025 19:01:38 +0000 (0:00:03.119) 0:00:10.100 *****
2025-09-23 19:01:54.186363 | orchestrator | changed: [testbed-manager]
2025-09-23 19:01:54.186374 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:01:54.186385 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:01:54.186396 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:54.186442 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:01:54.186462 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:01:54.186472 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:01:54.186481 | orchestrator |
2025-09-23 19:01:54.186491 | orchestrator | TASK [osism.commons.network : Remove netplan configuration template] ***********
2025-09-23 19:01:54.186501 | orchestrator | Tuesday 23 September 2025 19:01:40 +0000 (0:00:01.388) 0:00:11.488 *****
2025-09-23 19:01:54.186510 | orchestrator | ok: [testbed-manager -> localhost]
2025-09-23 19:01:54.186520 | orchestrator | ok: [testbed-node-0 -> localhost]
2025-09-23 19:01:54.186529 | orchestrator | ok: [testbed-node-1 -> localhost]
2025-09-23 19:01:54.186538 | orchestrator | ok: [testbed-node-2 -> localhost]
2025-09-23 19:01:54.186548 | orchestrator | ok: [testbed-node-3 -> localhost]
2025-09-23 19:01:54.186557 | orchestrator | ok: [testbed-node-4 -> localhost]
2025-09-23 19:01:54.186567 | orchestrator | ok: [testbed-node-5 -> localhost]
2025-09-23 19:01:54.186576 | orchestrator |
2025-09-23 19:01:54.186586 | orchestrator | TASK [osism.commons.network : Check if path for interface file exists] *********
2025-09-23 19:01:54.186595 | orchestrator | Tuesday 23 September 2025 19:01:42 +0000 (0:00:02.023) 0:00:13.512 *****
2025-09-23 19:01:54.186605 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:54.186614 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:01:54.186624 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:01:54.186633 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:01:54.186642 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:01:54.186652 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:01:54.186661 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:01:54.186671 | orchestrator |
2025-09-23 19:01:54.186680 | orchestrator | TASK [osism.commons.network : Copy interfaces file] ****************************
2025-09-23 19:01:54.186707 | orchestrator | Tuesday 23 September 2025 19:01:43 +0000 (0:00:01.142) 0:00:14.654 *****
2025-09-23 19:01:54.186718 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:01:54.186727 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:01:54.186737 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:01:54.186746 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:01:54.186756 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:01:54.186766 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:01:54.186780 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:01:54.186797 | orchestrator |
2025-09-23 19:01:54.186813 | orchestrator | TASK [osism.commons.network : Install package networkd-dispatcher] *************
2025-09-23 19:01:54.186829 | orchestrator | Tuesday 23 September 2025 19:01:44 +0000 (0:00:00.678) 0:00:15.333 *****
2025-09-23 19:01:54.186845 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:54.186861 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:01:54.186876 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:01:54.186892 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:01:54.186908 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:01:54.186925 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:01:54.186937 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:01:54.186946 | orchestrator |
2025-09-23 19:01:54.186956 | orchestrator | TASK [osism.commons.network : Copy dispatcher scripts] *************************
2025-09-23 19:01:54.186966 | orchestrator | Tuesday 23 September 2025 19:01:46 +0000 (0:00:02.232) 0:00:17.566 *****
2025-09-23 19:01:54.186975 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:01:54.186985 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:01:54.186995 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:01:54.187004 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:01:54.187013 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:01:54.187023 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:01:54.187033 | orchestrator | changed: [testbed-manager] => (item={'dest': 'routable.d/iptables.sh', 'src': '/opt/configuration/network/iptables.sh'})
2025-09-23 19:01:54.187044 | orchestrator |
2025-09-23 19:01:54.187054 | orchestrator | TASK [osism.commons.network : Manage service networkd-dispatcher] **************
2025-09-23 19:01:54.187064 | orchestrator | Tuesday 23 September 2025 19:01:47 +0000 (0:00:00.896) 0:00:18.462 *****
2025-09-23 19:01:54.187073 | orchestrator | ok: [testbed-manager]
2025-09-23 19:01:54.187091 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:01:54.187100 | orchestrator | changed: [testbed-node-0]
2025-09-23
19:01:54.187110 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:01:54.187119 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:01:54.187129 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:01:54.187138 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:01:54.187148 | orchestrator | 2025-09-23 19:01:54.187158 | orchestrator | TASK [osism.commons.network : Include cleanup tasks] *************************** 2025-09-23 19:01:54.187167 | orchestrator | Tuesday 23 September 2025 19:01:48 +0000 (0:00:01.598) 0:00:20.061 ***** 2025-09-23 19:01:54.187178 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-netplan.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:01:54.187189 | orchestrator | 2025-09-23 19:01:54.187199 | orchestrator | TASK [osism.commons.network : List existing configuration files] *************** 2025-09-23 19:01:54.187209 | orchestrator | Tuesday 23 September 2025 19:01:50 +0000 (0:00:01.227) 0:00:21.288 ***** 2025-09-23 19:01:54.187219 | orchestrator | ok: [testbed-manager] 2025-09-23 19:01:54.187228 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:01:54.187238 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:01:54.187247 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:01:54.187257 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:01:54.187266 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:01:54.187276 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:01:54.187285 | orchestrator | 2025-09-23 19:01:54.187295 | orchestrator | TASK [osism.commons.network : Set network_configured_files fact] *************** 2025-09-23 19:01:54.187305 | orchestrator | Tuesday 23 September 2025 19:01:52 +0000 (0:00:01.974) 0:00:23.263 ***** 2025-09-23 19:01:54.187314 | orchestrator | ok: [testbed-manager] 2025-09-23 19:01:54.187323 | orchestrator | ok: [testbed-node-0] 2025-09-23 
19:01:54.187333 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:01:54.187342 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:01:54.187352 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:01:54.187361 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:01:54.187371 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:01:54.187380 | orchestrator | 2025-09-23 19:01:54.187390 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] *************** 2025-09-23 19:01:54.187399 | orchestrator | Tuesday 23 September 2025 19:01:52 +0000 (0:00:00.874) 0:00:24.138 ***** 2025-09-23 19:01:54.187440 | orchestrator | skipping: [testbed-manager] => (item=/etc/netplan/01-osism.yaml)  2025-09-23 19:01:54.187457 | orchestrator | skipping: [testbed-node-0] => (item=/etc/netplan/01-osism.yaml)  2025-09-23 19:01:54.187474 | orchestrator | skipping: [testbed-node-1] => (item=/etc/netplan/01-osism.yaml)  2025-09-23 19:01:54.187490 | orchestrator | skipping: [testbed-node-2] => (item=/etc/netplan/01-osism.yaml)  2025-09-23 19:01:54.187506 | orchestrator | changed: [testbed-manager] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-23 19:01:54.187516 | orchestrator | skipping: [testbed-node-3] => (item=/etc/netplan/01-osism.yaml)  2025-09-23 19:01:54.187525 | orchestrator | changed: [testbed-node-0] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-23 19:01:54.187535 | orchestrator | skipping: [testbed-node-4] => (item=/etc/netplan/01-osism.yaml)  2025-09-23 19:01:54.187544 | orchestrator | changed: [testbed-node-1] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-23 19:01:54.187554 | orchestrator | skipping: [testbed-node-5] => (item=/etc/netplan/01-osism.yaml)  2025-09-23 19:01:54.187564 | orchestrator | changed: [testbed-node-2] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-23 19:01:54.187573 | orchestrator | changed: [testbed-node-3] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-23 19:01:54.187583 | orchestrator | changed: [testbed-node-4] => 
(item=/etc/netplan/50-cloud-init.yaml) 2025-09-23 19:01:54.187593 | orchestrator | changed: [testbed-node-5] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-23 19:01:54.187602 | orchestrator | 2025-09-23 19:01:54.187621 | orchestrator | TASK [osism.commons.network : Include dummy interfaces] ************************ 2025-09-23 19:02:09.962111 | orchestrator | Tuesday 23 September 2025 19:01:54 +0000 (0:00:01.245) 0:00:25.383 ***** 2025-09-23 19:02:09.962208 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:02:09.962224 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:02:09.962235 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:02:09.962247 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:02:09.962258 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:02:09.962269 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:02:09.962280 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:02:09.962291 | orchestrator | 2025-09-23 19:02:09.962303 | orchestrator | TASK [osism.commons.network : Include vxlan interfaces] ************************ 2025-09-23 19:02:09.962315 | orchestrator | Tuesday 23 September 2025 19:01:54 +0000 (0:00:00.671) 0:00:26.055 ***** 2025-09-23 19:02:09.962327 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/vxlan-interfaces.yml for testbed-manager, testbed-node-0, testbed-node-2, testbed-node-4, testbed-node-1, testbed-node-3, testbed-node-5 2025-09-23 19:02:09.962341 | orchestrator | 2025-09-23 19:02:09.962353 | orchestrator | TASK [osism.commons.network : Create systemd networkd netdev files] ************ 2025-09-23 19:02:09.962364 | orchestrator | Tuesday 23 September 2025 19:01:59 +0000 (0:00:04.532) 0:00:30.587 ***** 2025-09-23 19:02:09.962377 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'addresses': ['192.168.112.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', 
'192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962445 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962476 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962489 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962500 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962512 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962523 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962534 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'addresses': [], 
'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962546 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.11/20'], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962557 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.10/20'], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962588 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.12/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962617 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.15/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962631 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.14/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962645 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.13/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': 
'192.168.16.13', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962657 | orchestrator | 2025-09-23 19:02:09.962671 | orchestrator | TASK [osism.commons.network : Create systemd networkd network files] *********** 2025-09-23 19:02:09.962684 | orchestrator | Tuesday 23 September 2025 19:02:04 +0000 (0:00:05.192) 0:00:35.779 ***** 2025-09-23 19:02:09.962698 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'addresses': ['192.168.112.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962711 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962724 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962741 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962755 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962769 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', 
'192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962782 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962796 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 42}}) 2025-09-23 19:02:09.962816 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.11/20'], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962829 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.10/20'], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962842 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.15/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962857 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.13/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:09.962881 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.12/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:15.314850 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.14/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 23}}) 2025-09-23 19:02:15.314956 | orchestrator | 2025-09-23 19:02:15.314972 | orchestrator | TASK [osism.commons.network : Include networkd cleanup tasks] ****************** 2025-09-23 19:02:15.314984 | orchestrator | Tuesday 23 September 2025 19:02:09 +0000 (0:00:05.374) 0:00:41.154 ***** 2025-09-23 19:02:15.314996 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-networkd.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:02:15.315007 | orchestrator | 2025-09-23 19:02:15.315017 | orchestrator | TASK [osism.commons.network : List existing configuration files] *************** 2025-09-23 19:02:15.315027 | orchestrator | Tuesday 23 September 2025 19:02:11 +0000 (0:00:01.106) 0:00:42.261 ***** 2025-09-23 19:02:15.315037 | orchestrator | ok: [testbed-manager] 2025-09-23 19:02:15.315048 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:02:15.315057 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:02:15.315067 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:02:15.315077 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:02:15.315086 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:02:15.315096 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:02:15.315106 | orchestrator | 2025-09-23 19:02:15.315115 | orchestrator | TASK [osism.commons.network : Remove 
unused configuration files] *************** 2025-09-23 19:02:15.315125 | orchestrator | Tuesday 23 September 2025 19:02:12 +0000 (0:00:01.077) 0:00:43.339 ***** 2025-09-23 19:02:15.315135 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-23 19:02:15.315146 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-23 19:02:15.315155 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-23 19:02:15.315178 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-23 19:02:15.315189 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:02:15.315199 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-23 19:02:15.315209 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-23 19:02:15.315218 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-23 19:02:15.315248 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-23 19:02:15.315258 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-23 19:02:15.315268 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-23 19:02:15.315277 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-23 19:02:15.315287 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-23 19:02:15.315296 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:02:15.315306 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-23 19:02:15.315315 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.network)  
2025-09-23 19:02:15.315325 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-23 19:02:15.315334 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-23 19:02:15.315344 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:02:15.315353 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-23 19:02:15.315363 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-23 19:02:15.315372 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-23 19:02:15.315383 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-23 19:02:15.315421 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:02:15.315432 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-23 19:02:15.315443 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-23 19:02:15.315454 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-23 19:02:15.315465 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-23 19:02:15.315475 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:02:15.315486 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:02:15.315497 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-23 19:02:15.315507 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-23 19:02:15.315518 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-23 19:02:15.315529 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-23 19:02:15.315540 | 
orchestrator | skipping: [testbed-node-5] 2025-09-23 19:02:15.315551 | orchestrator | 2025-09-23 19:02:15.315561 | orchestrator | RUNNING HANDLER [osism.commons.network : Reload systemd-networkd] ************** 2025-09-23 19:02:15.315588 | orchestrator | Tuesday 23 September 2025 19:02:13 +0000 (0:00:01.600) 0:00:44.940 ***** 2025-09-23 19:02:15.315599 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:02:15.315610 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:02:15.315621 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:02:15.315633 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:02:15.315644 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:02:15.315655 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:02:15.315666 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:02:15.315677 | orchestrator | 2025-09-23 19:02:15.315688 | orchestrator | RUNNING HANDLER [osism.commons.network : Netplan configuration changed] ******** 2025-09-23 19:02:15.315699 | orchestrator | Tuesday 23 September 2025 19:02:14 +0000 (0:00:00.562) 0:00:45.502 ***** 2025-09-23 19:02:15.315711 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:02:15.315722 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:02:15.315733 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:02:15.315752 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:02:15.315762 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:02:15.315771 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:02:15.315781 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:02:15.315790 | orchestrator | 2025-09-23 19:02:15.315800 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:02:15.315811 | orchestrator | testbed-manager : ok=21  changed=5  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-09-23 19:02:15.315822 | orchestrator | testbed-node-0 : ok=20  changed=5  
unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-23 19:02:15.315832 | orchestrator | testbed-node-1 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-23 19:02:15.315842 | orchestrator | testbed-node-2 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-23 19:02:15.315856 | orchestrator | testbed-node-3 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-23 19:02:15.315866 | orchestrator | testbed-node-4 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-23 19:02:15.315876 | orchestrator | testbed-node-5 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-23 19:02:15.315886 | orchestrator | 2025-09-23 19:02:15.315896 | orchestrator | 2025-09-23 19:02:15.315906 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:02:15.315916 | orchestrator | Tuesday 23 September 2025 19:02:14 +0000 (0:00:00.684) 0:00:46.187 ***** 2025-09-23 19:02:15.315925 | orchestrator | =============================================================================== 2025-09-23 19:02:15.315935 | orchestrator | osism.commons.network : Create systemd networkd network files ----------- 5.37s 2025-09-23 19:02:15.315945 | orchestrator | osism.commons.network : Create systemd networkd netdev files ------------ 5.19s 2025-09-23 19:02:15.315954 | orchestrator | osism.commons.network : Include vxlan interfaces ------------------------ 4.53s 2025-09-23 19:02:15.315964 | orchestrator | osism.commons.network : Prepare netplan configuration template ---------- 3.12s 2025-09-23 19:02:15.315973 | orchestrator | osism.commons.network : Install package networkd-dispatcher ------------- 2.23s 2025-09-23 19:02:15.315983 | orchestrator | osism.commons.network : Install required packages ----------------------- 2.06s 2025-09-23 19:02:15.315992 | orchestrator | osism.commons.network : Remove netplan 
configuration template ----------- 2.02s 2025-09-23 19:02:15.316002 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.97s 2025-09-23 19:02:15.316011 | orchestrator | osism.commons.network : Remove ifupdown package ------------------------- 1.79s 2025-09-23 19:02:15.316021 | orchestrator | osism.commons.network : Remove unused configuration files --------------- 1.60s 2025-09-23 19:02:15.316031 | orchestrator | osism.commons.network : Manage service networkd-dispatcher -------------- 1.60s 2025-09-23 19:02:15.316040 | orchestrator | osism.commons.network : Copy netplan configuration ---------------------- 1.39s 2025-09-23 19:02:15.316050 | orchestrator | osism.commons.network : Remove unused configuration files --------------- 1.25s 2025-09-23 19:02:15.316059 | orchestrator | osism.commons.network : Include cleanup tasks --------------------------- 1.23s 2025-09-23 19:02:15.316069 | orchestrator | osism.commons.network : Include type specific tasks --------------------- 1.19s 2025-09-23 19:02:15.316078 | orchestrator | osism.commons.network : Check if path for interface file exists --------- 1.14s 2025-09-23 19:02:15.316088 | orchestrator | osism.commons.network : Include networkd cleanup tasks ------------------ 1.11s 2025-09-23 19:02:15.316097 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.08s 2025-09-23 19:02:15.316113 | orchestrator | osism.commons.network : Create required directories --------------------- 0.96s 2025-09-23 19:02:15.316123 | orchestrator | osism.commons.network : Copy dispatcher scripts ------------------------- 0.90s 2025-09-23 19:02:15.597181 | orchestrator | + osism apply wireguard 2025-09-23 19:02:27.644108 | orchestrator | 2025-09-23 19:02:27 | INFO  | Task c2c1ff36-6745-4d1c-ba9f-9e790494a1d6 (wireguard) was prepared for execution. 
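The network play above ends with the osism.commons.network role writing `/etc/systemd/network/30-vxlan0.netdev` and `30-vxlan0.network` style files from the per-host items shown in the log (`vni`, `local_ip`, `mtu`, `dests`, `addresses`). As an illustrative sketch only — the role's real templates may differ — the testbed-manager item (`vni: 42`, `local_ip: 192.168.16.5`, `mtu: 1350`) would roughly correspond to:

```
# Sketch of /etc/systemd/network/30-vxlan0.netdev (illustrative, not the role's actual template)
[NetDev]
Name=vxlan0
Kind=vxlan
MTUBytes=1350

[VXLAN]
VNI=42
Local=192.168.16.5

# Sketch of /etc/systemd/network/30-vxlan0.network
[Match]
Name=vxlan0

[Network]
Address=192.168.112.5/20

# One all-zero FDB flood entry per address in 'dests' is one plausible way
# to express the unicast remote list from the log; this is an assumption,
# not confirmed by the role output.
[BridgeFDB]
MACAddress=00:00:00:00:00:00
Destination=192.168.16.10
```

The "Remove unused configuration files" task later in the play lists exactly these `30-vxlan*.netdev`/`.network` paths as already managed, which is consistent with this file layout.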
2025-09-23 19:02:27.644200 | orchestrator | 2025-09-23 19:02:27 | INFO | It takes a moment until task c2c1ff36-6745-4d1c-ba9f-9e790494a1d6 (wireguard) has been started and output is visible here.
2025-09-23 19:02:46.145296 | orchestrator |
2025-09-23 19:02:46.145430 | orchestrator | PLAY [Apply role wireguard] ****************************************************
2025-09-23 19:02:46.145449 | orchestrator |
2025-09-23 19:02:46.145462 | orchestrator | TASK [osism.services.wireguard : Install iptables package] *********************
2025-09-23 19:02:46.145474 | orchestrator | Tuesday 23 September 2025 19:02:31 +0000 (0:00:00.223) 0:00:00.223 *****
2025-09-23 19:02:46.145485 | orchestrator | ok: [testbed-manager]
2025-09-23 19:02:46.145497 | orchestrator |
2025-09-23 19:02:46.145509 | orchestrator | TASK [osism.services.wireguard : Install wireguard package] ********************
2025-09-23 19:02:46.145520 | orchestrator | Tuesday 23 September 2025 19:02:33 +0000 (0:00:01.473) 0:00:01.696 *****
2025-09-23 19:02:46.145531 | orchestrator | changed: [testbed-manager]
2025-09-23 19:02:46.145542 | orchestrator |
2025-09-23 19:02:46.145553 | orchestrator | TASK [osism.services.wireguard : Create public and private key - server] *******
2025-09-23 19:02:46.145564 | orchestrator | Tuesday 23 September 2025 19:02:39 +0000 (0:00:06.199) 0:00:07.896 *****
2025-09-23 19:02:46.145575 | orchestrator | changed: [testbed-manager]
2025-09-23 19:02:46.145585 | orchestrator |
2025-09-23 19:02:46.145596 | orchestrator | TASK [osism.services.wireguard : Create preshared key] *************************
2025-09-23 19:02:46.145607 | orchestrator | Tuesday 23 September 2025 19:02:39 +0000 (0:00:00.518) 0:00:08.414 *****
2025-09-23 19:02:46.145618 | orchestrator | changed: [testbed-manager]
2025-09-23 19:02:46.145629 | orchestrator |
2025-09-23 19:02:46.145640 | orchestrator | TASK [osism.services.wireguard : Get preshared key] ****************************
2025-09-23 19:02:46.145651 | orchestrator | Tuesday 23 September 2025 19:02:40 +0000 (0:00:00.386) 0:00:08.800 *****
2025-09-23 19:02:46.145661 | orchestrator | ok: [testbed-manager]
2025-09-23 19:02:46.145672 | orchestrator |
2025-09-23 19:02:46.145683 | orchestrator | TASK [osism.services.wireguard : Get public key - server] **********************
2025-09-23 19:02:46.145694 | orchestrator | Tuesday 23 September 2025 19:02:40 +0000 (0:00:00.506) 0:00:09.307 *****
2025-09-23 19:02:46.145705 | orchestrator | ok: [testbed-manager]
2025-09-23 19:02:46.145716 | orchestrator |
2025-09-23 19:02:46.145727 | orchestrator | TASK [osism.services.wireguard : Get private key - server] *********************
2025-09-23 19:02:46.145738 | orchestrator | Tuesday 23 September 2025 19:02:41 +0000 (0:00:00.444) 0:00:09.752 *****
2025-09-23 19:02:46.145749 | orchestrator | ok: [testbed-manager]
2025-09-23 19:02:46.145760 | orchestrator |
2025-09-23 19:02:46.145781 | orchestrator | TASK [osism.services.wireguard : Copy wg0.conf configuration file] *************
2025-09-23 19:02:46.145792 | orchestrator | Tuesday 23 September 2025 19:02:41 +0000 (0:00:00.402) 0:00:10.154 *****
2025-09-23 19:02:46.145803 | orchestrator | changed: [testbed-manager]
2025-09-23 19:02:46.145814 | orchestrator |
2025-09-23 19:02:46.145825 | orchestrator | TASK [osism.services.wireguard : Copy client configuration files] **************
2025-09-23 19:02:46.145838 | orchestrator | Tuesday 23 September 2025 19:02:42 +0000 (0:00:01.005) 0:00:11.159 *****
2025-09-23 19:02:46.145851 | orchestrator | changed: [testbed-manager] => (item=None)
2025-09-23 19:02:46.145865 | orchestrator | changed: [testbed-manager]
2025-09-23 19:02:46.145878 | orchestrator |
2025-09-23 19:02:46.145890 | orchestrator | TASK [osism.services.wireguard : Manage wg-quick@wg0.service service] **********
2025-09-23 19:02:46.145903 | orchestrator | Tuesday 23 September 2025 19:02:43 +0000 (0:00:00.911) 0:00:12.071 *****
2025-09-23 19:02:46.145915 | orchestrator | changed: [testbed-manager]
2025-09-23 19:02:46.145928 | orchestrator |
2025-09-23 19:02:46.145940 | orchestrator | RUNNING HANDLER [osism.services.wireguard : Restart wg0 service] ***************
2025-09-23 19:02:46.145973 | orchestrator | Tuesday 23 September 2025 19:02:45 +0000 (0:00:01.578) 0:00:13.649 *****
2025-09-23 19:02:46.145986 | orchestrator | changed: [testbed-manager]
2025-09-23 19:02:46.145999 | orchestrator |
2025-09-23 19:02:46.146011 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:02:46.146077 | orchestrator | testbed-manager : ok=11 changed=7 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:02:46.146100 | orchestrator |
2025-09-23 19:02:46.146119 | orchestrator |
2025-09-23 19:02:46.146133 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:02:46.146144 | orchestrator | Tuesday 23 September 2025 19:02:45 +0000 (0:00:00.876) 0:00:14.526 *****
2025-09-23 19:02:46.146155 | orchestrator | ===============================================================================
2025-09-23 19:02:46.146165 | orchestrator | osism.services.wireguard : Install wireguard package -------------------- 6.20s
2025-09-23 19:02:46.146176 | orchestrator | osism.services.wireguard : Manage wg-quick@wg0.service service ---------- 1.58s
2025-09-23 19:02:46.146188 | orchestrator | osism.services.wireguard : Install iptables package --------------------- 1.47s
2025-09-23 19:02:46.146199 | orchestrator | osism.services.wireguard : Copy wg0.conf configuration file ------------- 1.01s
2025-09-23 19:02:46.146209 | orchestrator | osism.services.wireguard : Copy client configuration files -------------- 0.91s
2025-09-23 19:02:46.146220 | orchestrator | osism.services.wireguard : Restart wg0 service -------------------------- 0.88s
2025-09-23 19:02:46.146231 | orchestrator | osism.services.wireguard : Create public and private key - server ------- 0.52s
2025-09-23 19:02:46.146242 | orchestrator | osism.services.wireguard : Get preshared key ---------------------------- 0.51s
2025-09-23 19:02:46.146253 | orchestrator | osism.services.wireguard : Get public key - server ---------------------- 0.44s
2025-09-23 19:02:46.146264 | orchestrator | osism.services.wireguard : Get private key - server --------------------- 0.40s
2025-09-23 19:02:46.146275 | orchestrator | osism.services.wireguard : Create preshared key ------------------------- 0.39s
2025-09-23 19:02:46.340791 | orchestrator | + sh -c /opt/configuration/scripts/prepare-wireguard-configuration.sh
2025-09-23 19:02:46.373591 | orchestrator |   % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
2025-09-23 19:02:46.373670 | orchestrator |                                  Dload  Upload   Total   Spent    Left  Speed
2025-09-23 19:02:49.763794 | orchestrator | 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- 0:00:01 --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- 0:00:02 --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- 0:00:03 --:--:-- 0 100 15 100 15 0 0 4 0 0:00:03 0:00:03 --:--:-- 4
2025-09-23 19:02:49.778754 | orchestrator | + osism apply --environment custom workarounds
2025-09-23 19:02:51.616251 | orchestrator | 2025-09-23 19:02:51 | INFO | Trying to run play workarounds in environment custom
2025-09-23 19:03:01.707443 | orchestrator | 2025-09-23 19:03:01 | INFO | Task 1cadb911-62ed-4994-b64c-accd9bdba6d4 (workarounds) was prepared for execution.
2025-09-23 19:03:01.707579 | orchestrator | 2025-09-23 19:03:01 | INFO | It takes a moment until task 1cadb911-62ed-4994-b64c-accd9bdba6d4 (workarounds) has been started and output is visible here.
2025-09-23 19:03:26.403978 | orchestrator |
2025-09-23 19:03:26.404105 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:03:26.404123 | orchestrator |
2025-09-23 19:03:26.404135 | orchestrator | TASK [Group hosts based on virtualization_role] ********************************
2025-09-23 19:03:26.404147 | orchestrator | Tuesday 23 September 2025 19:03:05 +0000 (0:00:00.146) 0:00:00.146 *****
2025-09-23 19:03:26.404159 | orchestrator | changed: [testbed-manager] => (item=virtualization_role_guest)
2025-09-23 19:03:26.404171 | orchestrator | changed: [testbed-node-0] => (item=virtualization_role_guest)
2025-09-23 19:03:26.404182 | orchestrator | changed: [testbed-node-1] => (item=virtualization_role_guest)
2025-09-23 19:03:26.404218 | orchestrator | changed: [testbed-node-2] => (item=virtualization_role_guest)
2025-09-23 19:03:26.404229 | orchestrator | changed: [testbed-node-3] => (item=virtualization_role_guest)
2025-09-23 19:03:26.404240 | orchestrator | changed: [testbed-node-4] => (item=virtualization_role_guest)
2025-09-23 19:03:26.404250 | orchestrator | changed: [testbed-node-5] => (item=virtualization_role_guest)
2025-09-23 19:03:26.404261 | orchestrator |
2025-09-23 19:03:26.404286 | orchestrator | PLAY [Apply netplan configuration on the manager node] *************************
2025-09-23 19:03:26.404298 | orchestrator |
2025-09-23 19:03:26.404309 | orchestrator | TASK [Apply netplan configuration] *********************************************
2025-09-23 19:03:26.404320 | orchestrator | Tuesday 23 September 2025 19:03:06 +0000 (0:00:00.756) 0:00:00.903 *****
2025-09-23 19:03:26.404331 | orchestrator | ok: [testbed-manager]
2025-09-23 19:03:26.404384 | orchestrator |
2025-09-23 19:03:26.404397 | orchestrator | PLAY [Apply netplan configuration on all other nodes] **************************
2025-09-23 19:03:26.404408 | orchestrator |
2025-09-23 19:03:26.404418 | orchestrator | TASK [Apply netplan configuration] *********************************************
2025-09-23 19:03:26.404429 | orchestrator | Tuesday 23 September 2025 19:03:08 +0000 (0:00:02.294) 0:00:03.198 *****
2025-09-23 19:03:26.404440 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:03:26.404451 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:03:26.404462 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:03:26.404472 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:03:26.404484 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:03:26.404494 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:03:26.404505 | orchestrator |
2025-09-23 19:03:26.404518 | orchestrator | PLAY [Add custom CA certificates to non-manager nodes] *************************
2025-09-23 19:03:26.404531 | orchestrator |
2025-09-23 19:03:26.404543 | orchestrator | TASK [Copy custom CA certificates] *********************************************
2025-09-23 19:03:26.404555 | orchestrator | Tuesday 23 September 2025 19:03:10 +0000 (0:00:01.730) 0:00:04.928 *****
2025-09-23 19:03:26.404568 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2025-09-23 19:03:26.404583 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2025-09-23 19:03:26.404596 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2025-09-23 19:03:26.404608 | orchestrator | changed: [testbed-node-4] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2025-09-23 19:03:26.404620 | orchestrator | changed: [testbed-node-3] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2025-09-23 19:03:26.404632 | orchestrator | changed: [testbed-node-5] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt)
2025-09-23 19:03:26.404644 | orchestrator |
2025-09-23 19:03:26.404657 | orchestrator | TASK [Run update-ca-certificates] **********************************************
2025-09-23 19:03:26.404669 | orchestrator | Tuesday 23 September 2025 19:03:11 +0000 (0:00:01.498) 0:00:06.427 *****
2025-09-23 19:03:26.404682 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:03:26.404695 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:03:26.404707 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:03:26.404720 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:03:26.404732 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:03:26.404744 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:03:26.404756 | orchestrator |
2025-09-23 19:03:26.404768 | orchestrator | TASK [Run update-ca-trust] *****************************************************
2025-09-23 19:03:26.404780 | orchestrator | Tuesday 23 September 2025 19:03:15 +0000 (0:00:03.772) 0:00:10.199 *****
2025-09-23 19:03:26.404793 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:03:26.404805 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:03:26.404817 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:03:26.404830 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:03:26.404852 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:03:26.404864 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:03:26.404876 | orchestrator |
2025-09-23 19:03:26.404889 | orchestrator | PLAY [Add a workaround service] ************************************************
2025-09-23 19:03:26.404899 | orchestrator |
2025-09-23 19:03:26.404910 | orchestrator | TASK [Copy workarounds.sh scripts] *********************************************
2025-09-23 19:03:26.404921 | orchestrator | Tuesday 23 September 2025 19:03:16 +0000 (0:00:00.709) 0:00:10.908 *****
2025-09-23 19:03:26.404931 | orchestrator | changed: [testbed-manager]
2025-09-23 19:03:26.404942 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:03:26.404953 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:03:26.404963 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:03:26.404975 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:03:26.404986 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:03:26.404997 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:03:26.405007 | orchestrator |
2025-09-23 19:03:26.405018 | orchestrator | TASK [Copy workarounds systemd unit file] **************************************
2025-09-23 19:03:26.405029 | orchestrator | Tuesday 23 September 2025 19:03:18 +0000 (0:00:01.661) 0:00:12.570 *****
2025-09-23 19:03:26.405040 | orchestrator | changed: [testbed-manager]
2025-09-23 19:03:26.405051 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:03:26.405061 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:03:26.405072 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:03:26.405083 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:03:26.405094 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:03:26.405123 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:03:26.405135 | orchestrator |
2025-09-23 19:03:26.405146 | orchestrator | TASK [Reload systemd daemon] ***************************************************
2025-09-23 19:03:26.405157 | orchestrator | Tuesday 23 September 2025 19:03:19 +0000 (0:00:01.605) 0:00:14.175 *****
2025-09-23 19:03:26.405168 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:03:26.405178 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:03:26.405189 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:03:26.405200 | orchestrator | ok: [testbed-manager]
2025-09-23 19:03:26.405211 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:03:26.405222 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:03:26.405233 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:03:26.405243 | orchestrator |
2025-09-23 19:03:26.405254 | orchestrator | TASK [Enable workarounds.service (Debian)] *************************************
2025-09-23 19:03:26.405265 | orchestrator | Tuesday 23 September 2025 19:03:21 +0000 (0:00:01.474) 0:00:15.649 *****
2025-09-23 19:03:26.405276 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:03:26.405287 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:03:26.405298 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:03:26.405309 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:03:26.405320 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:03:26.405331 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:03:26.405342 | orchestrator | changed: [testbed-manager]
2025-09-23 19:03:26.405385 | orchestrator |
2025-09-23 19:03:26.405397 | orchestrator | TASK [Enable and start workarounds.service (RedHat)] ***************************
2025-09-23 19:03:26.405408 | orchestrator | Tuesday 23 September 2025 19:03:23 +0000 (0:00:02.107) 0:00:17.757 *****
2025-09-23 19:03:26.405419 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:03:26.405430 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:03:26.405441 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:03:26.405452 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:03:26.405463 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:03:26.405474 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:03:26.405485 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:03:26.405496 | orchestrator |
2025-09-23 19:03:26.405507 | orchestrator | PLAY [On Ubuntu 24.04 install python3-docker from Debian Sid] ******************
2025-09-23 19:03:26.405518 | orchestrator |
2025-09-23 19:03:26.405529 | orchestrator | TASK [Install python3-docker] **************************************************
2025-09-23 19:03:26.405547 | orchestrator | Tuesday 23 September 2025 19:03:23 +0000 (0:00:00.532) 0:00:18.289 *****
2025-09-23 19:03:26.405559 | orchestrator | ok: [testbed-manager]
2025-09-23 19:03:26.405570 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:03:26.405581 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:03:26.405595 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:03:26.405607 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:03:26.405618 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:03:26.405629 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:03:26.405640 | orchestrator |
2025-09-23 19:03:26.405651 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:03:26.405663 | orchestrator | testbed-manager : ok=7 changed=4 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0
2025-09-23 19:03:26.405675 | orchestrator | testbed-node-0 : ok=9 changed=6 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:26.405687 | orchestrator | testbed-node-1 : ok=9 changed=6 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:26.405698 | orchestrator | testbed-node-2 : ok=9 changed=6 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:26.405709 | orchestrator | testbed-node-3 : ok=9 changed=6 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:26.405720 | orchestrator | testbed-node-4 : ok=9 changed=6 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:26.405740 | orchestrator | testbed-node-5 : ok=9 changed=6 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:26.405752 | orchestrator |
2025-09-23 19:03:26.405763 | orchestrator |
2025-09-23 19:03:26.405774 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:03:26.405785 | orchestrator | Tuesday 23 September 2025 19:03:26 +0000 (0:00:02.641) 0:00:20.931 *****
2025-09-23 19:03:26.405796 | orchestrator | ===============================================================================
2025-09-23 19:03:26.405807 | orchestrator | Run update-ca-certificates ---------------------------------------------- 3.77s
2025-09-23 19:03:26.405818 | orchestrator | Install python3-docker -------------------------------------------------- 2.64s
2025-09-23 19:03:26.405829 | orchestrator | Apply netplan configuration --------------------------------------------- 2.29s
2025-09-23 19:03:26.405840 | orchestrator | Enable workarounds.service (Debian) ------------------------------------- 2.11s
2025-09-23 19:03:26.405851 | orchestrator | Apply netplan configuration --------------------------------------------- 1.73s
2025-09-23 19:03:26.405861 | orchestrator | Copy workarounds.sh scripts --------------------------------------------- 1.66s
2025-09-23 19:03:26.405872 | orchestrator | Copy workarounds systemd unit file -------------------------------------- 1.61s
2025-09-23 19:03:26.405883 | orchestrator | Copy custom CA certificates --------------------------------------------- 1.50s
2025-09-23 19:03:26.405894 | orchestrator | Reload systemd daemon --------------------------------------------------- 1.47s
2025-09-23 19:03:26.405905 | orchestrator | Group hosts based on virtualization_role -------------------------------- 0.76s
2025-09-23 19:03:26.405916 | orchestrator | Run update-ca-trust ----------------------------------------------------- 0.71s
2025-09-23 19:03:26.405934 | orchestrator | Enable and start workarounds.service (RedHat) --------------------------- 0.53s
2025-09-23 19:03:26.974904 | orchestrator | + osism apply reboot -l testbed-nodes -e ireallymeanit=yes
2025-09-23 19:03:39.069560 | orchestrator | 2025-09-23 19:03:39 | INFO | Task 648191da-04d4-4739-9d2a-ca38c595e9f5 (reboot) was prepared for execution.
2025-09-23 19:03:39.069692 | orchestrator | 2025-09-23 19:03:39 | INFO | It takes a moment until task 648191da-04d4-4739-9d2a-ca38c595e9f5 (reboot) has been started and output is visible here.
2025-09-23 19:03:48.944713 | orchestrator |
2025-09-23 19:03:48.944823 | orchestrator | PLAY [Reboot systems] **********************************************************
2025-09-23 19:03:48.944841 | orchestrator |
2025-09-23 19:03:48.944854 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] *******************
2025-09-23 19:03:48.944866 | orchestrator | Tuesday 23 September 2025 19:03:43 +0000 (0:00:00.208) 0:00:00.208 *****
2025-09-23 19:03:48.944878 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:03:48.944890 | orchestrator |
2025-09-23 19:03:48.944902 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ******************
2025-09-23 19:03:48.944926 | orchestrator | Tuesday 23 September 2025 19:03:43 +0000 (0:00:00.101) 0:00:00.309 *****
2025-09-23 19:03:48.944937 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:03:48.944948 | orchestrator |
2025-09-23 19:03:48.944959 | orchestrator | TASK [Reboot system - wait for the reboot to complete] *************************
2025-09-23 19:03:48.944971 | orchestrator | Tuesday 23 September 2025 19:03:44 +0000 (0:00:00.936) 0:00:01.246 *****
2025-09-23 19:03:48.944982 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:03:48.944994 | orchestrator |
2025-09-23 19:03:48.945005 | orchestrator | PLAY [Reboot systems] **********************************************************
2025-09-23 19:03:48.945016 | orchestrator |
2025-09-23 19:03:48.945028 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] *******************
2025-09-23 19:03:48.945039 | orchestrator | Tuesday 23 September 2025 19:03:44 +0000 (0:00:00.129) 0:00:01.376 *****
2025-09-23 19:03:48.945050 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:03:48.945061 | orchestrator |
2025-09-23 19:03:48.945072 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ******************
2025-09-23 19:03:48.945084 | orchestrator | Tuesday 23 September 2025 19:03:44 +0000 (0:00:00.093) 0:00:01.469 *****
2025-09-23 19:03:48.945095 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:03:48.945106 | orchestrator |
2025-09-23 19:03:48.945117 | orchestrator | TASK [Reboot system - wait for the reboot to complete] *************************
2025-09-23 19:03:48.945128 | orchestrator | Tuesday 23 September 2025 19:03:44 +0000 (0:00:00.684) 0:00:02.153 *****
2025-09-23 19:03:48.945139 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:03:48.945150 | orchestrator |
2025-09-23 19:03:48.945162 | orchestrator | PLAY [Reboot systems] **********************************************************
2025-09-23 19:03:48.945172 | orchestrator |
2025-09-23 19:03:48.945183 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] *******************
2025-09-23 19:03:48.945194 | orchestrator | Tuesday 23 September 2025 19:03:45 +0000 (0:00:00.109) 0:00:02.262 *****
2025-09-23 19:03:48.945205 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:03:48.945217 | orchestrator |
2025-09-23 19:03:48.945228 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ******************
2025-09-23 19:03:48.945239 | orchestrator | Tuesday 23 September 2025 19:03:45 +0000 (0:00:00.187) 0:00:02.450 *****
2025-09-23 19:03:48.945250 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:03:48.945263 | orchestrator |
2025-09-23 19:03:48.945276 | orchestrator | TASK [Reboot system - wait for the reboot to complete] *************************
2025-09-23 19:03:48.945289 | orchestrator | Tuesday 23 September 2025 19:03:45 +0000 (0:00:00.680) 0:00:03.130 *****
2025-09-23 19:03:48.945301 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:03:48.945314 | orchestrator |
2025-09-23 19:03:48.945326 | orchestrator | PLAY [Reboot systems] **********************************************************
2025-09-23 19:03:48.945362 | orchestrator |
2025-09-23 19:03:48.945375 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] *******************
2025-09-23 19:03:48.945387 | orchestrator | Tuesday 23 September 2025 19:03:46 +0000 (0:00:00.111) 0:00:03.242 *****
2025-09-23 19:03:48.945399 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:03:48.945412 | orchestrator |
2025-09-23 19:03:48.945424 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ******************
2025-09-23 19:03:48.945437 | orchestrator | Tuesday 23 September 2025 19:03:46 +0000 (0:00:00.098) 0:00:03.341 *****
2025-09-23 19:03:48.945449 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:03:48.945483 | orchestrator |
2025-09-23 19:03:48.945496 | orchestrator | TASK [Reboot system - wait for the reboot to complete] *************************
2025-09-23 19:03:48.945509 | orchestrator | Tuesday 23 September 2025 19:03:46 +0000 (0:00:00.697) 0:00:04.038 *****
2025-09-23 19:03:48.945521 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:03:48.945533 | orchestrator |
2025-09-23 19:03:48.945546 | orchestrator | PLAY [Reboot systems] **********************************************************
2025-09-23 19:03:48.945558 | orchestrator |
2025-09-23 19:03:48.945571 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] *******************
2025-09-23 19:03:48.945584 | orchestrator | Tuesday 23 September 2025 19:03:46 +0000 (0:00:00.109) 0:00:04.148 *****
2025-09-23 19:03:48.945596 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:03:48.945609 | orchestrator |
2025-09-23 19:03:48.945621 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ******************
2025-09-23 19:03:48.945634 | orchestrator | Tuesday 23 September 2025 19:03:47 +0000 (0:00:00.112) 0:00:04.261 *****
2025-09-23 19:03:48.945646 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:03:48.945659 | orchestrator |
2025-09-23 19:03:48.945670 | orchestrator | TASK [Reboot system - wait for the reboot to complete] *************************
2025-09-23 19:03:48.945681 | orchestrator | Tuesday 23 September 2025 19:03:47 +0000 (0:00:00.649) 0:00:04.910 *****
2025-09-23 19:03:48.945691 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:03:48.945702 | orchestrator |
2025-09-23 19:03:48.945713 | orchestrator | PLAY [Reboot systems] **********************************************************
2025-09-23 19:03:48.945724 | orchestrator |
2025-09-23 19:03:48.945735 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] *******************
2025-09-23 19:03:48.945746 | orchestrator | Tuesday 23 September 2025 19:03:47 +0000 (0:00:00.106) 0:00:05.017 *****
2025-09-23 19:03:48.945757 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:03:48.945769 | orchestrator |
2025-09-23 19:03:48.945780 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ******************
2025-09-23 19:03:48.945791 | orchestrator | Tuesday 23 September 2025 19:03:47 +0000 (0:00:00.137) 0:00:05.154 *****
2025-09-23 19:03:48.945802 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:03:48.945813 | orchestrator |
2025-09-23 19:03:48.945824 | orchestrator | TASK [Reboot system - wait for the reboot to complete] *************************
2025-09-23 19:03:48.945835 | orchestrator | Tuesday 23 September 2025 19:03:48 +0000 (0:00:00.654) 0:00:05.809 *****
2025-09-23 19:03:48.945864 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:03:48.945876 | orchestrator |
2025-09-23 19:03:48.945887 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:03:48.945900 | orchestrator | testbed-node-0 : ok=1 changed=1 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:48.945918 | orchestrator | testbed-node-1 : ok=1 changed=1 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:48.945929 | orchestrator | testbed-node-2 : ok=1 changed=1 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:48.945941 | orchestrator | testbed-node-3 : ok=1 changed=1 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:48.945952 | orchestrator | testbed-node-4 : ok=1 changed=1 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:48.945963 | orchestrator | testbed-node-5 : ok=1 changed=1 unreachable=0 failed=0 skipped=2 rescued=0 ignored=0
2025-09-23 19:03:48.945973 | orchestrator |
2025-09-23 19:03:48.945985 | orchestrator |
2025-09-23 19:03:48.945996 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:03:48.946007 | orchestrator | Tuesday 23 September 2025 19:03:48 +0000 (0:00:00.037) 0:00:05.846 *****
2025-09-23 19:03:48.946078 | orchestrator | ===============================================================================
2025-09-23 19:03:48.946092 | orchestrator | Reboot system - do not wait for the reboot to complete ------------------ 4.30s
2025-09-23 19:03:48.946103 | orchestrator | Exit playbook, if user did not mean to reboot systems ------------------- 0.73s
2025-09-23 19:03:48.946114 | orchestrator | Reboot system - wait for the reboot to complete ------------------------- 0.60s
2025-09-23 19:03:49.236793 | orchestrator | + osism apply wait-for-connection -l testbed-nodes -e ireallymeanit=yes
2025-09-23 19:04:01.282732 | orchestrator | 2025-09-23 19:04:01 | INFO | Task a315e427-76fb-49d0-b9b0-b7458f13ec44 (wait-for-connection) was prepared for execution.
2025-09-23 19:04:01.282834 | orchestrator | 2025-09-23 19:04:01 | INFO | It takes a moment until task a315e427-76fb-49d0-b9b0-b7458f13ec44 (wait-for-connection) has been started and output is visible here.
2025-09-23 19:04:16.805774 | orchestrator |
2025-09-23 19:04:16.805889 | orchestrator | PLAY [Wait until remote systems are reachable] *********************************
2025-09-23 19:04:16.805905 | orchestrator |
2025-09-23 19:04:16.805918 | orchestrator | TASK [Wait until remote system is reachable] ***********************************
2025-09-23 19:04:16.805930 | orchestrator | Tuesday 23 September 2025 19:04:05 +0000 (0:00:00.173) 0:00:00.173 *****
2025-09-23 19:04:16.805941 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:04:16.805954 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:04:16.805965 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:04:16.805976 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:04:16.805987 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:04:16.805998 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:04:16.806009 | orchestrator |
2025-09-23 19:04:16.806077 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:04:16.806091 | orchestrator | testbed-node-0 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:04:16.806104 | orchestrator | testbed-node-1 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:04:16.806116 | orchestrator | testbed-node-2 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:04:16.806128 | orchestrator | testbed-node-3 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:04:16.806149 | orchestrator | testbed-node-4 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:04:16.806161 | orchestrator | testbed-node-5 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:04:16.806172 | orchestrator |
2025-09-23 19:04:16.806183 | orchestrator |
2025-09-23 19:04:16.806195 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:04:16.806206 | orchestrator | Tuesday 23 September 2025 19:04:16 +0000 (0:00:11.480) 0:00:11.654 *****
2025-09-23 19:04:16.806217 | orchestrator | ===============================================================================
2025-09-23 19:04:16.806228 | orchestrator | Wait until remote system is reachable ---------------------------------- 11.48s
2025-09-23 19:04:17.058306 | orchestrator | + osism apply hddtemp
2025-09-23 19:04:28.992518 | orchestrator | 2025-09-23 19:04:28 | INFO | Task a29ffdbc-e9b7-4810-b5e7-bde454ac50c9 (hddtemp) was prepared for execution.
2025-09-23 19:04:28.992613 | orchestrator | 2025-09-23 19:04:28 | INFO | It takes a moment until task a29ffdbc-e9b7-4810-b5e7-bde454ac50c9 (hddtemp) has been started and output is visible here.
2025-09-23 19:04:55.910869 | orchestrator |
2025-09-23 19:04:55.910944 | orchestrator | PLAY [Apply role hddtemp] ******************************************************
2025-09-23 19:04:55.910950 | orchestrator |
2025-09-23 19:04:55.910955 | orchestrator | TASK [osism.services.hddtemp : Gather variables for each operating system] *****
2025-09-23 19:04:55.910960 | orchestrator | Tuesday 23 September 2025 19:04:32 +0000 (0:00:00.229) 0:00:00.229 *****
2025-09-23 19:04:55.910978 | orchestrator | ok: [testbed-manager]
2025-09-23 19:04:55.910983 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:04:55.910988 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:04:55.910991 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:04:55.910995 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:04:55.910999 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:04:55.911003 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:04:55.911007 | orchestrator |
2025-09-23 19:04:55.911021 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific install tasks] ****
2025-09-23 19:04:55.911025 | orchestrator | Tuesday 23 September 2025 19:04:33 +0000 (0:00:00.600) 0:00:00.830 *****
2025-09-23 19:04:55.911030 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:04:55.911035 | orchestrator |
2025-09-23 19:04:55.911039 | orchestrator | TASK [osism.services.hddtemp : Remove hddtemp package] *************************
2025-09-23 19:04:55.911043 | orchestrator | Tuesday 23 September 2025 19:04:34 +0000 (0:00:01.023) 0:00:01.854 *****
2025-09-23 19:04:55.911047 | orchestrator | ok: [testbed-manager]
2025-09-23 19:04:55.911051 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:04:55.911055 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:04:55.911059 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:04:55.911063 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:04:55.911067 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:04:55.911071 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:04:55.911074 | orchestrator |
2025-09-23 19:04:55.911078 | orchestrator | TASK [osism.services.hddtemp : Enable Kernel Module drivetemp] *****************
2025-09-23 19:04:55.911082 | orchestrator | Tuesday 23 September 2025 19:04:36 +0000 (0:00:01.940) 0:00:03.794 *****
2025-09-23 19:04:55.911086 | orchestrator | changed: [testbed-manager]
2025-09-23 19:04:55.911091 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:04:55.911095 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:04:55.911098 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:04:55.911102 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:04:55.911106 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:04:55.911110 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:04:55.911114 | orchestrator |
2025-09-23 19:04:55.911118 | orchestrator | TASK [osism.services.hddtemp : Check if drivetemp module is available] *********
2025-09-23 19:04:55.911122 | orchestrator | Tuesday 23 September 2025 19:04:37 +0000 (0:00:01.012) 0:00:04.806 *****
2025-09-23 19:04:55.911126 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:04:55.911130 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:04:55.911133 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:04:55.911137 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:04:55.911141 | orchestrator | ok: [testbed-manager]
2025-09-23 19:04:55.911145 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:04:55.911149 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:04:55.911152 | orchestrator |
2025-09-23 19:04:55.911156 | orchestrator | TASK [osism.services.hddtemp : Load Kernel Module drivetemp] *******************
2025-09-23 19:04:55.911160 | orchestrator | Tuesday 23 September 2025 19:04:38 +0000 (0:00:01.088) 0:00:05.895 *****
2025-09-23 19:04:55.911164 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:04:55.911168 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:04:55.911172 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:04:55.911176 | orchestrator | changed: [testbed-manager]
2025-09-23 19:04:55.911180 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:04:55.911184 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:04:55.911187 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:04:55.911191 | orchestrator |
2025-09-23 19:04:55.911195 | orchestrator | TASK [osism.services.hddtemp : Install lm-sensors] *****************************
2025-09-23 19:04:55.911199 | orchestrator | Tuesday 23 September 2025 19:04:39 +0000 (0:00:00.683) 0:00:06.579 *****
2025-09-23 19:04:55.911203 | orchestrator | changed: [testbed-manager]
2025-09-23 19:04:55.911210 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:04:55.911214 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:04:55.911218 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:04:55.911221 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:04:55.911225 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:04:55.911229 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:04:55.911233 | orchestrator |
2025-09-23 19:04:55.911237 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific service tasks] ****
2025-09-23 19:04:55.911240 | orchestrator | Tuesday 23 September 2025 19:04:52 +0000 (0:00:13.296) 0:00:19.876 *****
2025-09-23 19:04:55.911244 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/service-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:04:55.911248 | orchestrator |
2025-09-23 19:04:55.911252 | orchestrator | TASK [osism.services.hddtemp : Manage lm-sensors service] **********************
2025-09-23 19:04:55.911256 | orchestrator | Tuesday 23 September 2025 19:04:53 +0000 (0:00:01.342) 0:00:21.218 *****
2025-09-23 19:04:55.911260 | orchestrator | changed: [testbed-manager]
2025-09-23 19:04:55.911263 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:04:55.911267 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:04:55.911271 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:04:55.911275 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:04:55.911279 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:04:55.911282 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:04:55.911286 | orchestrator |
2025-09-23 19:04:55.911306 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:04:55.911310 | orchestrator | testbed-manager : ok=9 changed=4 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:04:55.911325 | orchestrator | testbed-node-0 : ok=8 changed=3 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0
2025-09-23 19:04:55.911329 | orchestrator | testbed-node-1 :
ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:04:55.911333 | orchestrator | testbed-node-2 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:04:55.911340 | orchestrator | testbed-node-3 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:04:55.911344 | orchestrator | testbed-node-4 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:04:55.911348 | orchestrator | testbed-node-5 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:04:55.911352 | orchestrator | 2025-09-23 19:04:55.911356 | orchestrator | 2025-09-23 19:04:55.911359 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:04:55.911363 | orchestrator | Tuesday 23 September 2025 19:04:55 +0000 (0:00:01.853) 0:00:23.072 ***** 2025-09-23 19:04:55.911369 | orchestrator | =============================================================================== 2025-09-23 19:04:55.911375 | orchestrator | osism.services.hddtemp : Install lm-sensors ---------------------------- 13.30s 2025-09-23 19:04:55.911381 | orchestrator | osism.services.hddtemp : Remove hddtemp package ------------------------- 1.94s 2025-09-23 19:04:55.911388 | orchestrator | osism.services.hddtemp : Manage lm-sensors service ---------------------- 1.85s 2025-09-23 19:04:55.911392 | orchestrator | osism.services.hddtemp : Include distribution specific service tasks ---- 1.34s 2025-09-23 19:04:55.911395 | orchestrator | osism.services.hddtemp : Check if drivetemp module is available --------- 1.09s 2025-09-23 19:04:55.911400 | orchestrator | osism.services.hddtemp : Include distribution specific install tasks ---- 1.02s 2025-09-23 19:04:55.911410 | orchestrator | osism.services.hddtemp : Enable Kernel Module drivetemp ----------------- 1.01s 2025-09-23 19:04:55.911414 | orchestrator | osism.services.hddtemp : Load 
Kernel Module drivetemp ------------------- 0.68s 2025-09-23 19:04:55.911418 | orchestrator | osism.services.hddtemp : Gather variables for each operating system ----- 0.60s 2025-09-23 19:04:56.162418 | orchestrator | ++ semver latest 7.1.1 2025-09-23 19:04:56.215845 | orchestrator | + [[ -1 -ge 0 ]] 2025-09-23 19:04:56.215877 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2025-09-23 19:04:56.215882 | orchestrator | + sudo systemctl restart manager.service 2025-09-23 19:05:33.683562 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2025-09-23 19:05:33.683668 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2025-09-23 19:05:33.683684 | orchestrator | + local max_attempts=60 2025-09-23 19:05:33.683698 | orchestrator | + local name=ceph-ansible 2025-09-23 19:05:33.683709 | orchestrator | + local attempt_num=1 2025-09-23 19:05:33.683720 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:05:33.721648 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:05:33.721730 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:05:33.721743 | orchestrator | + sleep 5 2025-09-23 19:05:38.725246 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:05:38.765672 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:05:38.765754 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:05:38.765769 | orchestrator | + sleep 5 2025-09-23 19:05:43.770612 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:05:43.798683 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:05:43.798752 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:05:43.798765 | orchestrator | + sleep 5 2025-09-23 19:05:48.802222 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:05:48.839948 | orchestrator | + 
[[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:05:48.840013 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:05:48.840026 | orchestrator | + sleep 5 2025-09-23 19:05:53.845579 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:05:53.882966 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:05:53.883036 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:05:53.883043 | orchestrator | + sleep 5 2025-09-23 19:05:58.888858 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:05:58.935639 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:05:58.935716 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:05:58.935729 | orchestrator | + sleep 5 2025-09-23 19:06:03.940812 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:06:03.978209 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:03.978351 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:06:03.978377 | orchestrator | + sleep 5 2025-09-23 19:06:08.984650 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:06:09.055333 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:09.055448 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:06:09.055465 | orchestrator | + sleep 5 2025-09-23 19:06:14.057642 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:06:14.095497 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:14.095573 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:06:14.095587 | orchestrator | + sleep 5 2025-09-23 19:06:19.098436 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:06:19.142084 | orchestrator | + [[ starting == 
\h\e\a\l\t\h\y ]] 2025-09-23 19:06:19.142170 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:06:19.142184 | orchestrator | + sleep 5 2025-09-23 19:06:24.147021 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:06:24.182548 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:24.182629 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:06:24.182644 | orchestrator | + sleep 5 2025-09-23 19:06:29.187963 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:06:29.226924 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:29.227022 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:06:29.227037 | orchestrator | + sleep 5 2025-09-23 19:06:34.232347 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:06:34.269411 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:34.269513 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-23 19:06:34.269537 | orchestrator | + sleep 5 2025-09-23 19:06:39.274412 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-23 19:06:39.304601 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:39.304669 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2025-09-23 19:06:39.304683 | orchestrator | + local max_attempts=60 2025-09-23 19:06:39.304696 | orchestrator | + local name=kolla-ansible 2025-09-23 19:06:39.304708 | orchestrator | + local attempt_num=1 2025-09-23 19:06:39.304893 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2025-09-23 19:06:39.337457 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:39.337526 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2025-09-23 19:06:39.337540 | orchestrator | + local max_attempts=60 2025-09-23 
19:06:39.337553 | orchestrator | + local name=osism-ansible 2025-09-23 19:06:39.337564 | orchestrator | + local attempt_num=1 2025-09-23 19:06:39.338067 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2025-09-23 19:06:39.364666 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-23 19:06:39.364717 | orchestrator | + [[ true == \t\r\u\e ]] 2025-09-23 19:06:39.364731 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2025-09-23 19:06:39.525181 | orchestrator | ARA in ceph-ansible already disabled. 2025-09-23 19:06:39.689827 | orchestrator | ARA in kolla-ansible already disabled. 2025-09-23 19:06:39.989987 | orchestrator | ARA in osism-kubernetes already disabled. 2025-09-23 19:06:39.990528 | orchestrator | + osism apply gather-facts 2025-09-23 19:06:52.037971 | orchestrator | 2025-09-23 19:06:52 | INFO  | Task 18fc1fb0-c78d-4d52-8168-188fb0c663b5 (gather-facts) was prepared for execution. 2025-09-23 19:06:52.038147 | orchestrator | 2025-09-23 19:06:52 | INFO  | It takes a moment until task 18fc1fb0-c78d-4d52-8168-188fb0c663b5 (gather-facts) has been started and output is visible here. 
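
The repeated `docker inspect -f '{{.State.Health.Status}}'` calls followed by `sleep 5` in the xtrace above come from a polling helper. A minimal sketch of that loop, reconstructed from the trace; the `HEALTH_CMD` override is a hypothetical test seam added here so the loop can be exercised without Docker:

```shell
# Reconstruction of wait_for_container_healthy as seen in the xtrace above.
# The real script queries Docker directly; HEALTH_CMD is a hypothetical
# override so the loop can run without a Docker daemon.
docker_health() { docker inspect -f '{{.State.Health.Status}}' "$1"; }

wait_for_container_healthy() {
    local max_attempts=$1
    local name=$2
    local attempt_num=1
    while [[ "$(${HEALTH_CMD:-docker_health} "$name")" != "healthy" ]]; do
        if (( attempt_num++ == max_attempts )); then
            echo "container $name never became healthy" >&2
            return 1
        fi
        sleep 5
    done
}
```

In the run above, `ceph-ansible` cycles through `unhealthy` and `starting` for roughly a minute before the `healthy` check finally passes, while `kolla-ansible` and `osism-ansible` are healthy on the first probe.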
2025-09-23 19:07:05.899494 | orchestrator | 2025-09-23 19:07:05.899594 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-09-23 19:07:05.899610 | orchestrator | 2025-09-23 19:07:05.899622 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-09-23 19:07:05.899634 | orchestrator | Tuesday 23 September 2025 19:06:55 +0000 (0:00:00.167) 0:00:00.167 ***** 2025-09-23 19:07:05.899645 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:07:05.899657 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:07:05.899668 | orchestrator | ok: [testbed-manager] 2025-09-23 19:07:05.899679 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:07:05.899690 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:07:05.899701 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:07:05.899712 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:07:05.899722 | orchestrator | 2025-09-23 19:07:05.899734 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-09-23 19:07:05.899744 | orchestrator | 2025-09-23 19:07:05.899755 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-09-23 19:07:05.899766 | orchestrator | Tuesday 23 September 2025 19:07:05 +0000 (0:00:09.502) 0:00:09.670 ***** 2025-09-23 19:07:05.899777 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:07:05.899789 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:07:05.899801 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:07:05.899812 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:07:05.899823 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:07:05.899834 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:07:05.899844 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:07:05.899855 | orchestrator | 2025-09-23 19:07:05.899866 | orchestrator | PLAY RECAP 
********************************************************************* 2025-09-23 19:07:05.899878 | orchestrator | testbed-manager : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:07:05.899918 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:07:05.899938 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:07:05.899954 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:07:05.899972 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:07:05.899991 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:07:05.900012 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-23 19:07:05.900032 | orchestrator | 2025-09-23 19:07:05.900047 | orchestrator | 2025-09-23 19:07:05.900060 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:07:05.900073 | orchestrator | Tuesday 23 September 2025 19:07:05 +0000 (0:00:00.447) 0:00:10.118 ***** 2025-09-23 19:07:05.900085 | orchestrator | =============================================================================== 2025-09-23 19:07:05.900097 | orchestrator | Gathers facts about hosts ----------------------------------------------- 9.50s 2025-09-23 19:07:05.900110 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.45s 2025-09-23 19:07:06.121153 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/001-helpers.sh /usr/local/bin/deploy-helper 2025-09-23 19:07:06.135414 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/500-kubernetes.sh /usr/local/bin/deploy-kubernetes 2025-09-23 19:07:06.144056 | 
orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/510-clusterapi.sh /usr/local/bin/deploy-kubernetes-clusterapi 2025-09-23 19:07:06.152668 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-ansible.sh /usr/local/bin/deploy-ceph-with-ansible 2025-09-23 19:07:06.165615 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-rook.sh /usr/local/bin/deploy-ceph-with-rook 2025-09-23 19:07:06.180593 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/200-infrastructure.sh /usr/local/bin/deploy-infrastructure 2025-09-23 19:07:06.190144 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/300-openstack.sh /usr/local/bin/deploy-openstack 2025-09-23 19:07:06.201937 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/400-monitoring.sh /usr/local/bin/deploy-monitoring 2025-09-23 19:07:06.211168 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/500-kubernetes.sh /usr/local/bin/upgrade-kubernetes 2025-09-23 19:07:06.221028 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/510-clusterapi.sh /usr/local/bin/upgrade-kubernetes-clusterapi 2025-09-23 19:07:06.231670 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-ansible.sh /usr/local/bin/upgrade-ceph-with-ansible 2025-09-23 19:07:06.242453 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-rook.sh /usr/local/bin/upgrade-ceph-with-rook 2025-09-23 19:07:06.255144 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/200-infrastructure.sh /usr/local/bin/upgrade-infrastructure 2025-09-23 19:07:06.266568 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/300-openstack.sh /usr/local/bin/upgrade-openstack 2025-09-23 19:07:06.278363 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/400-monitoring.sh /usr/local/bin/upgrade-monitoring 2025-09-23 19:07:06.292261 | orchestrator | + sudo ln -sf 
/opt/configuration/scripts/bootstrap/300-openstack.sh /usr/local/bin/bootstrap-openstack 2025-09-23 19:07:06.301597 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/301-openstack-octavia-amhpora-image.sh /usr/local/bin/bootstrap-octavia 2025-09-23 19:07:06.312738 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/302-openstack-k8s-clusterapi-images.sh /usr/local/bin/bootstrap-clusterapi 2025-09-23 19:07:06.321609 | orchestrator | + sudo ln -sf /opt/configuration/scripts/disable-local-registry.sh /usr/local/bin/disable-local-registry 2025-09-23 19:07:06.334567 | orchestrator | + sudo ln -sf /opt/configuration/scripts/pull-images.sh /usr/local/bin/pull-images 2025-09-23 19:07:06.344490 | orchestrator | + [[ false == \t\r\u\e ]] 2025-09-23 19:07:06.749792 | orchestrator | ok: Runtime: 0:23:33.413605 2025-09-23 19:07:06.858408 | 2025-09-23 19:07:06.858542 | TASK [Deploy services] 2025-09-23 19:07:07.392314 | orchestrator | skipping: Conditional result was False 2025-09-23 19:07:07.412004 | 2025-09-23 19:07:07.412200 | TASK [Deploy in a nutshell] 2025-09-23 19:07:08.068961 | orchestrator | 2025-09-23 19:07:08.069127 | orchestrator | # PULL IMAGES 2025-09-23 19:07:08.069153 | orchestrator | 2025-09-23 19:07:08.069168 | orchestrator | + set -e 2025-09-23 19:07:08.069187 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-09-23 19:07:08.069208 | orchestrator | ++ export INTERACTIVE=false 2025-09-23 19:07:08.069245 | orchestrator | ++ INTERACTIVE=false 2025-09-23 19:07:08.069289 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-09-23 19:07:08.069312 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-09-23 19:07:08.069327 | orchestrator | + source /opt/manager-vars.sh 2025-09-23 19:07:08.069339 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-09-23 19:07:08.069357 | orchestrator | ++ NUMBER_OF_NODES=6 2025-09-23 19:07:08.069369 | orchestrator | ++ export CEPH_VERSION=reef 2025-09-23 19:07:08.069387 | orchestrator | ++ 
CEPH_VERSION=reef 2025-09-23 19:07:08.069399 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-09-23 19:07:08.069416 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-09-23 19:07:08.069427 | orchestrator | ++ export MANAGER_VERSION=latest 2025-09-23 19:07:08.069441 | orchestrator | ++ MANAGER_VERSION=latest 2025-09-23 19:07:08.069453 | orchestrator | ++ export OPENSTACK_VERSION=2024.2 2025-09-23 19:07:08.069466 | orchestrator | ++ OPENSTACK_VERSION=2024.2 2025-09-23 19:07:08.069477 | orchestrator | ++ export ARA=false 2025-09-23 19:07:08.069488 | orchestrator | ++ ARA=false 2025-09-23 19:07:08.069499 | orchestrator | ++ export DEPLOY_MODE=manager 2025-09-23 19:07:08.069511 | orchestrator | ++ DEPLOY_MODE=manager 2025-09-23 19:07:08.069522 | orchestrator | ++ export TEMPEST=false 2025-09-23 19:07:08.069533 | orchestrator | ++ TEMPEST=false 2025-09-23 19:07:08.069544 | orchestrator | ++ export IS_ZUUL=true 2025-09-23 19:07:08.069555 | orchestrator | ++ IS_ZUUL=true 2025-09-23 19:07:08.069566 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.123 2025-09-23 19:07:08.069578 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.123 2025-09-23 19:07:08.069589 | orchestrator | ++ export EXTERNAL_API=false 2025-09-23 19:07:08.069600 | orchestrator | ++ EXTERNAL_API=false 2025-09-23 19:07:08.069610 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-09-23 19:07:08.069622 | orchestrator | ++ IMAGE_USER=ubuntu 2025-09-23 19:07:08.069633 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-09-23 19:07:08.069644 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-09-23 19:07:08.069655 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2025-09-23 19:07:08.069666 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-09-23 19:07:08.069677 | orchestrator | + echo 2025-09-23 19:07:08.069694 | orchestrator | + echo '# PULL IMAGES' 2025-09-23 19:07:08.069707 | orchestrator | + echo 2025-09-23 19:07:08.069731 | orchestrator | ++ semver latest 7.0.0 2025-09-23 
19:07:08.115757 | orchestrator | + [[ -1 -ge 0 ]] 2025-09-23 19:07:08.115805 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2025-09-23 19:07:08.115817 | orchestrator | + osism apply --no-wait -r 2 -e custom pull-images 2025-09-23 19:07:09.917477 | orchestrator | 2025-09-23 19:07:09 | INFO  | Trying to run play pull-images in environment custom 2025-09-23 19:07:19.997509 | orchestrator | 2025-09-23 19:07:19 | INFO  | Task 7186df93-35bd-4ad5-b7ca-c405259303fb (pull-images) was prepared for execution. 2025-09-23 19:07:19.997618 | orchestrator | 2025-09-23 19:07:19 | INFO  | Task 7186df93-35bd-4ad5-b7ca-c405259303fb is running in background. No more output. Check ARA for logs. 2025-09-23 19:07:21.935299 | orchestrator | 2025-09-23 19:07:21 | INFO  | Trying to run play wipe-partitions in environment custom 2025-09-23 19:07:32.096094 | orchestrator | 2025-09-23 19:07:32 | INFO  | Task 4c8c9413-aa68-451b-befc-5c06f84b497f (wipe-partitions) was prepared for execution. 2025-09-23 19:07:32.096939 | orchestrator | 2025-09-23 19:07:32 | INFO  | It takes a moment until task 4c8c9413-aa68-451b-befc-5c06f84b497f (wipe-partitions) has been started and output is visible here. 
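
The `semver latest 7.0.0` / `[[ -1 -ge 0 ]]` / `[[ latest == \l\a\t\e\s\t ]]` sequence in the trace is a version gate: the step runs when the manager version is at least the threshold, or when it is the moving `latest` tag (for which the `semver` helper prints `-1`). A hedged sketch of that pattern; `version_allows` is a hypothetical name, and the `semver` helper is assumed to print `-1`/`0`/`1` like a comparator:

```shell
# Sketch of the version-gate pattern visible in the trace: proceed when
# the version is >= the threshold, or when it is "latest" (which the
# assumed semver helper compares as -1). version_allows is hypothetical.
version_allows() {
    local version=$1 threshold=$2 cmp
    cmp=$(semver "$version" "$threshold" 2>/dev/null || echo -1)
    [[ $cmp -ge 0 || $version == latest ]]
}

# Usage, mirroring the gated step in the log:
#   if version_allows "$MANAGER_VERSION" 7.0.0; then
#       osism apply --no-wait -r 2 -e custom pull-images
#   fi
```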
2025-09-23 19:07:44.013860 | orchestrator | 2025-09-23 19:07:44.013978 | orchestrator | PLAY [Wipe partitions] ********************************************************* 2025-09-23 19:07:44.013997 | orchestrator | 2025-09-23 19:07:44.014011 | orchestrator | TASK [Find all logical devices owned by UID 167] ******************************* 2025-09-23 19:07:44.014134 | orchestrator | Tuesday 23 September 2025 19:07:35 +0000 (0:00:00.121) 0:00:00.121 ***** 2025-09-23 19:07:44.014148 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:07:44.014160 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:07:44.014172 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:07:44.014184 | orchestrator | 2025-09-23 19:07:44.014195 | orchestrator | TASK [Remove all rook related logical devices] ********************************* 2025-09-23 19:07:44.014285 | orchestrator | Tuesday 23 September 2025 19:07:36 +0000 (0:00:00.579) 0:00:00.700 ***** 2025-09-23 19:07:44.014305 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:07:44.014323 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:07:44.014345 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:07:44.014364 | orchestrator | 2025-09-23 19:07:44.014384 | orchestrator | TASK [Find all logical devices with prefix ceph] ******************************* 2025-09-23 19:07:44.014404 | orchestrator | Tuesday 23 September 2025 19:07:36 +0000 (0:00:00.237) 0:00:00.938 ***** 2025-09-23 19:07:44.014422 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:07:44.014442 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:07:44.014460 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:07:44.014477 | orchestrator | 2025-09-23 19:07:44.014496 | orchestrator | TASK [Remove all ceph related logical devices] ********************************* 2025-09-23 19:07:44.014516 | orchestrator | Tuesday 23 September 2025 19:07:37 +0000 (0:00:00.664) 0:00:01.602 ***** 2025-09-23 19:07:44.014537 | orchestrator | skipping: 
[testbed-node-3] 2025-09-23 19:07:44.014555 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:07:44.014573 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:07:44.014600 | orchestrator | 2025-09-23 19:07:44.014621 | orchestrator | TASK [Check device availability] *********************************************** 2025-09-23 19:07:44.014639 | orchestrator | Tuesday 23 September 2025 19:07:37 +0000 (0:00:00.255) 0:00:01.858 ***** 2025-09-23 19:07:44.014657 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2025-09-23 19:07:44.014680 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2025-09-23 19:07:44.014699 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2025-09-23 19:07:44.014716 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2025-09-23 19:07:44.014735 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2025-09-23 19:07:44.014753 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2025-09-23 19:07:44.014771 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2025-09-23 19:07:44.014783 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2025-09-23 19:07:44.014794 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2025-09-23 19:07:44.014804 | orchestrator | 2025-09-23 19:07:44.014815 | orchestrator | TASK [Wipe partitions with wipefs] ********************************************* 2025-09-23 19:07:44.014827 | orchestrator | Tuesday 23 September 2025 19:07:38 +0000 (0:00:01.190) 0:00:03.049 ***** 2025-09-23 19:07:44.014838 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdb) 2025-09-23 19:07:44.014849 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdb) 2025-09-23 19:07:44.014859 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdb) 2025-09-23 19:07:44.014870 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdc) 2025-09-23 19:07:44.014880 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdc) 2025-09-23 19:07:44.014891 | orchestrator | ok: 
[testbed-node-5] => (item=/dev/sdc) 2025-09-23 19:07:44.014901 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdd) 2025-09-23 19:07:44.014912 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdd) 2025-09-23 19:07:44.014922 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdd) 2025-09-23 19:07:44.014933 | orchestrator | 2025-09-23 19:07:44.014944 | orchestrator | TASK [Overwrite first 32M with zeros] ****************************************** 2025-09-23 19:07:44.014955 | orchestrator | Tuesday 23 September 2025 19:07:40 +0000 (0:00:01.370) 0:00:04.419 ***** 2025-09-23 19:07:44.014965 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2025-09-23 19:07:44.014976 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2025-09-23 19:07:44.014986 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2025-09-23 19:07:44.014997 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2025-09-23 19:07:44.015007 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2025-09-23 19:07:44.015018 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2025-09-23 19:07:44.015028 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2025-09-23 19:07:44.015051 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2025-09-23 19:07:44.015068 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2025-09-23 19:07:44.015080 | orchestrator | 2025-09-23 19:07:44.015090 | orchestrator | TASK [Reload udev rules] ******************************************************* 2025-09-23 19:07:44.015101 | orchestrator | Tuesday 23 September 2025 19:07:42 +0000 (0:00:02.410) 0:00:06.830 ***** 2025-09-23 19:07:44.015112 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:07:44.015122 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:07:44.015133 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:07:44.015143 | orchestrator | 2025-09-23 19:07:44.015154 | orchestrator | TASK [Request device events from the 
kernel] *********************************** 2025-09-23 19:07:44.015164 | orchestrator | Tuesday 23 September 2025 19:07:43 +0000 (0:00:00.603) 0:00:07.434 ***** 2025-09-23 19:07:44.015175 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:07:44.015186 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:07:44.015196 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:07:44.015250 | orchestrator | 2025-09-23 19:07:44.015262 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:07:44.015277 | orchestrator | testbed-node-3 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:07:44.015289 | orchestrator | testbed-node-4 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:07:44.015322 | orchestrator | testbed-node-5 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:07:44.015334 | orchestrator | 2025-09-23 19:07:44.015345 | orchestrator | 2025-09-23 19:07:44.015356 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:07:44.015367 | orchestrator | Tuesday 23 September 2025 19:07:43 +0000 (0:00:00.613) 0:00:08.047 ***** 2025-09-23 19:07:44.015377 | orchestrator | =============================================================================== 2025-09-23 19:07:44.015388 | orchestrator | Overwrite first 32M with zeros ------------------------------------------ 2.41s 2025-09-23 19:07:44.015399 | orchestrator | Wipe partitions with wipefs --------------------------------------------- 1.37s 2025-09-23 19:07:44.015410 | orchestrator | Check device availability ----------------------------------------------- 1.19s 2025-09-23 19:07:44.015421 | orchestrator | Find all logical devices with prefix ceph ------------------------------- 0.66s 2025-09-23 19:07:44.015431 | orchestrator | Request device events from the kernel 
----------------------------------- 0.61s 2025-09-23 19:07:44.015442 | orchestrator | Reload udev rules ------------------------------------------------------- 0.60s 2025-09-23 19:07:44.015453 | orchestrator | Find all logical devices owned by UID 167 ------------------------------- 0.58s 2025-09-23 19:07:44.015463 | orchestrator | Remove all ceph related logical devices --------------------------------- 0.26s 2025-09-23 19:07:44.015474 | orchestrator | Remove all rook related logical devices --------------------------------- 0.24s 2025-09-23 19:07:56.256263 | orchestrator | 2025-09-23 19:07:56 | INFO  | Task 5f6bc9ed-1775-46c7-a32b-02058f20582e (facts) was prepared for execution. 2025-09-23 19:07:56.256358 | orchestrator | 2025-09-23 19:07:56 | INFO  | It takes a moment until task 5f6bc9ed-1775-46c7-a32b-02058f20582e (facts) has been started and output is visible here. 2025-09-23 19:08:09.247404 | orchestrator | 2025-09-23 19:08:09.247537 | orchestrator | PLAY [Apply role facts] ******************************************************** 2025-09-23 19:08:09.247557 | orchestrator | 2025-09-23 19:08:09.247569 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-09-23 19:08:09.247581 | orchestrator | Tuesday 23 September 2025 19:08:00 +0000 (0:00:00.277) 0:00:00.277 ***** 2025-09-23 19:08:09.247593 | orchestrator | ok: [testbed-manager] 2025-09-23 19:08:09.247653 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:08:09.247668 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:08:09.247704 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:08:09.247716 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:08:09.247727 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:08:09.247737 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:08:09.247748 | orchestrator | 2025-09-23 19:08:09.247759 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-09-23 19:08:09.247770 | 
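For reference, the disk-wipe play recapped above runs four steps per data disk: "Wipe partitions with wipefs", "Overwrite first 32M with zeros", "Reload udev rules", and "Request device events from the kernel". The exact module arguments are not visible in the log, so the command lines below are an assumption of what those tasks execute, sketched as a small Python helper:

```python
# Illustrative sketch of the per-device wipe sequence from the play above.
# The specific flags (oflag=direct, --all) are assumptions; only the tools,
# the order, and the 32M size come from the log.

def wipe_commands(device: str) -> list[list[str]]:
    """Return the commands that clear a block device before it is
    handed to Ceph, in the order the play runs them."""
    return [
        ["wipefs", "--all", device],                 # drop partition/FS signatures
        ["dd", "if=/dev/zero", f"of={device}",
         "bs=1M", "count=32", "oflag=direct"],       # zero the first 32M
        ["udevadm", "control", "--reload-rules"],    # reload udev rules
        ["udevadm", "trigger"],                      # request device events from the kernel
    ]

# Executing them would look like:
#   import subprocess
#   for cmd in wipe_commands("/dev/sdb"):
#       subprocess.run(cmd, check=True)
```

Zeroing only the first 32M is enough to destroy LVM/partition metadata without the cost of wiping the whole disk, which is why the task finishes in ~2.4s across all three nodes.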
orchestrator | Tuesday 23 September 2025 19:08:01 +0000 (0:00:01.094) 0:00:01.371 ***** 2025-09-23 19:08:09.247781 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:08:09.247793 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:08:09.247803 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:08:09.247814 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:08:09.247825 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:09.247835 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:08:09.247846 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:09.247857 | orchestrator | 2025-09-23 19:08:09.247868 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-09-23 19:08:09.247879 | orchestrator | 2025-09-23 19:08:09.247905 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-09-23 19:08:09.247917 | orchestrator | Tuesday 23 September 2025 19:08:02 +0000 (0:00:01.191) 0:00:02.563 ***** 2025-09-23 19:08:09.247928 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:08:09.247939 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:08:09.247953 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:08:09.247965 | orchestrator | ok: [testbed-manager] 2025-09-23 19:08:09.247978 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:08:09.247991 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:08:09.248003 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:08:09.248016 | orchestrator | 2025-09-23 19:08:09.248029 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-09-23 19:08:09.248042 | orchestrator | 2025-09-23 19:08:09.248054 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-09-23 19:08:09.248067 | orchestrator | Tuesday 23 September 2025 19:08:08 +0000 (0:00:05.871) 0:00:08.434 ***** 2025-09-23 19:08:09.248079 | orchestrator | 
skipping: [testbed-manager] 2025-09-23 19:08:09.248091 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:08:09.248103 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:08:09.248116 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:08:09.248129 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:09.248141 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:08:09.248154 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:09.248166 | orchestrator | 2025-09-23 19:08:09.248179 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:08:09.248214 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:08:09.248228 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:08:09.248241 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:08:09.248252 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:08:09.248263 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:08:09.248274 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:08:09.248285 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-23 19:08:09.248295 | orchestrator | 2025-09-23 19:08:09.248317 | orchestrator | 2025-09-23 19:08:09.248328 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:08:09.248339 | orchestrator | Tuesday 23 September 2025 19:08:08 +0000 (0:00:00.494) 0:00:08.929 ***** 2025-09-23 19:08:09.248350 | orchestrator | =============================================================================== 
2025-09-23 19:08:09.248361 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.87s 2025-09-23 19:08:09.248371 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.19s 2025-09-23 19:08:09.248383 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.09s 2025-09-23 19:08:09.248394 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.50s 2025-09-23 19:08:11.456076 | orchestrator | 2025-09-23 19:08:11 | INFO  | Task 0d9ec496-64e4-4b8b-9393-2d884f7a589b (ceph-configure-lvm-volumes) was prepared for execution. 2025-09-23 19:08:11.456179 | orchestrator | 2025-09-23 19:08:11 | INFO  | It takes a moment until task 0d9ec496-64e4-4b8b-9393-2d884f7a589b (ceph-configure-lvm-volumes) has been started and output is visible here. 2025-09-23 19:08:22.926372 | orchestrator | 2025-09-23 19:08:22.926457 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-09-23 19:08:22.926467 | orchestrator | 2025-09-23 19:08:22.926473 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-09-23 19:08:22.926480 | orchestrator | Tuesday 23 September 2025 19:08:15 +0000 (0:00:00.313) 0:00:00.314 ***** 2025-09-23 19:08:22.926487 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-09-23 19:08:22.926493 | orchestrator | 2025-09-23 19:08:22.926499 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-09-23 19:08:22.926505 | orchestrator | Tuesday 23 September 2025 19:08:15 +0000 (0:00:00.240) 0:00:00.554 ***** 2025-09-23 19:08:22.926512 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:08:22.926518 | orchestrator | 2025-09-23 19:08:22.926524 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926530 | orchestrator | 
Tuesday 23 September 2025 19:08:16 +0000 (0:00:00.227) 0:00:00.781 ***** 2025-09-23 19:08:22.926536 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2025-09-23 19:08:22.926543 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2025-09-23 19:08:22.926549 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2025-09-23 19:08:22.926562 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2025-09-23 19:08:22.926568 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2025-09-23 19:08:22.926574 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2025-09-23 19:08:22.926580 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2025-09-23 19:08:22.926586 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2025-09-23 19:08:22.926592 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2025-09-23 19:08:22.926598 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2025-09-23 19:08:22.926604 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2025-09-23 19:08:22.926610 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2025-09-23 19:08:22.926616 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2025-09-23 19:08:22.926622 | orchestrator | 2025-09-23 19:08:22.926628 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926634 | orchestrator | Tuesday 23 September 2025 19:08:16 +0000 (0:00:00.373) 0:00:01.154 ***** 2025-09-23 
19:08:22.926640 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.926664 | orchestrator | 2025-09-23 19:08:22.926674 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926683 | orchestrator | Tuesday 23 September 2025 19:08:16 +0000 (0:00:00.445) 0:00:01.600 ***** 2025-09-23 19:08:22.926691 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.926700 | orchestrator | 2025-09-23 19:08:22.926708 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926716 | orchestrator | Tuesday 23 September 2025 19:08:17 +0000 (0:00:00.193) 0:00:01.794 ***** 2025-09-23 19:08:22.926725 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.926734 | orchestrator | 2025-09-23 19:08:22.926746 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926756 | orchestrator | Tuesday 23 September 2025 19:08:17 +0000 (0:00:00.194) 0:00:01.988 ***** 2025-09-23 19:08:22.926765 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.926776 | orchestrator | 2025-09-23 19:08:22.926785 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926794 | orchestrator | Tuesday 23 September 2025 19:08:17 +0000 (0:00:00.198) 0:00:02.187 ***** 2025-09-23 19:08:22.926804 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.926812 | orchestrator | 2025-09-23 19:08:22.926821 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926830 | orchestrator | Tuesday 23 September 2025 19:08:17 +0000 (0:00:00.198) 0:00:02.385 ***** 2025-09-23 19:08:22.926839 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.926849 | orchestrator | 2025-09-23 19:08:22.926864 | orchestrator | TASK [Add known links to the list of available block devices] 
****************** 2025-09-23 19:08:22.926875 | orchestrator | Tuesday 23 September 2025 19:08:17 +0000 (0:00:00.205) 0:00:02.590 ***** 2025-09-23 19:08:22.926885 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.926894 | orchestrator | 2025-09-23 19:08:22.926903 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926914 | orchestrator | Tuesday 23 September 2025 19:08:18 +0000 (0:00:00.188) 0:00:02.779 ***** 2025-09-23 19:08:22.926924 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.926934 | orchestrator | 2025-09-23 19:08:22.926945 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.926960 | orchestrator | Tuesday 23 September 2025 19:08:18 +0000 (0:00:00.210) 0:00:02.990 ***** 2025-09-23 19:08:22.926972 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37) 2025-09-23 19:08:22.926984 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37) 2025-09-23 19:08:22.926994 | orchestrator | 2025-09-23 19:08:22.927003 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.927012 | orchestrator | Tuesday 23 September 2025 19:08:18 +0000 (0:00:00.398) 0:00:03.388 ***** 2025-09-23 19:08:22.927039 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5) 2025-09-23 19:08:22.927050 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5) 2025-09-23 19:08:22.927060 | orchestrator | 2025-09-23 19:08:22.927071 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.927080 | orchestrator | Tuesday 23 September 2025 19:08:19 +0000 (0:00:00.417) 0:00:03.806 ***** 2025-09-23 19:08:22.927096 | 
orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd) 2025-09-23 19:08:22.927105 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd) 2025-09-23 19:08:22.927111 | orchestrator | 2025-09-23 19:08:22.927117 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.927123 | orchestrator | Tuesday 23 September 2025 19:08:19 +0000 (0:00:00.592) 0:00:04.398 ***** 2025-09-23 19:08:22.927129 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6) 2025-09-23 19:08:22.927143 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6) 2025-09-23 19:08:22.927150 | orchestrator | 2025-09-23 19:08:22.927156 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:08:22.927162 | orchestrator | Tuesday 23 September 2025 19:08:20 +0000 (0:00:00.592) 0:00:04.991 ***** 2025-09-23 19:08:22.927169 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-09-23 19:08:22.927175 | orchestrator | 2025-09-23 19:08:22.927207 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927213 | orchestrator | Tuesday 23 September 2025 19:08:20 +0000 (0:00:00.685) 0:00:05.676 ***** 2025-09-23 19:08:22.927220 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2025-09-23 19:08:22.927226 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2025-09-23 19:08:22.927232 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2025-09-23 19:08:22.927238 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => 
(item=loop3) 2025-09-23 19:08:22.927244 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2025-09-23 19:08:22.927250 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2025-09-23 19:08:22.927257 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2025-09-23 19:08:22.927263 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2025-09-23 19:08:22.927269 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2025-09-23 19:08:22.927275 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2025-09-23 19:08:22.927281 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2025-09-23 19:08:22.927288 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd) 2025-09-23 19:08:22.927293 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2025-09-23 19:08:22.927298 | orchestrator | 2025-09-23 19:08:22.927304 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927309 | orchestrator | Tuesday 23 September 2025 19:08:21 +0000 (0:00:00.382) 0:00:06.059 ***** 2025-09-23 19:08:22.927315 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.927320 | orchestrator | 2025-09-23 19:08:22.927325 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927331 | orchestrator | Tuesday 23 September 2025 19:08:21 +0000 (0:00:00.191) 0:00:06.250 ***** 2025-09-23 19:08:22.927336 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.927341 | orchestrator | 2025-09-23 19:08:22.927347 | orchestrator | TASK [Add known 
partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927352 | orchestrator | Tuesday 23 September 2025 19:08:21 +0000 (0:00:00.209) 0:00:06.460 ***** 2025-09-23 19:08:22.927358 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.927363 | orchestrator | 2025-09-23 19:08:22.927368 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927374 | orchestrator | Tuesday 23 September 2025 19:08:21 +0000 (0:00:00.224) 0:00:06.684 ***** 2025-09-23 19:08:22.927379 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.927384 | orchestrator | 2025-09-23 19:08:22.927390 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927395 | orchestrator | Tuesday 23 September 2025 19:08:22 +0000 (0:00:00.200) 0:00:06.885 ***** 2025-09-23 19:08:22.927401 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.927406 | orchestrator | 2025-09-23 19:08:22.927415 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927421 | orchestrator | Tuesday 23 September 2025 19:08:22 +0000 (0:00:00.202) 0:00:07.087 ***** 2025-09-23 19:08:22.927426 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.927432 | orchestrator | 2025-09-23 19:08:22.927437 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927443 | orchestrator | Tuesday 23 September 2025 19:08:22 +0000 (0:00:00.190) 0:00:07.278 ***** 2025-09-23 19:08:22.927448 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:22.927453 | orchestrator | 2025-09-23 19:08:22.927459 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:22.927464 | orchestrator | Tuesday 23 September 2025 19:08:22 +0000 (0:00:00.199) 0:00:07.478 ***** 2025-09-23 19:08:22.927475 | 
orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242090 | orchestrator | 2025-09-23 19:08:30.242231 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:30.242249 | orchestrator | Tuesday 23 September 2025 19:08:22 +0000 (0:00:00.194) 0:00:07.672 ***** 2025-09-23 19:08:30.242261 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2025-09-23 19:08:30.242272 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2025-09-23 19:08:30.242282 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2025-09-23 19:08:30.242292 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2025-09-23 19:08:30.242302 | orchestrator | 2025-09-23 19:08:30.242312 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:30.242322 | orchestrator | Tuesday 23 September 2025 19:08:23 +0000 (0:00:01.067) 0:00:08.739 ***** 2025-09-23 19:08:30.242350 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242361 | orchestrator | 2025-09-23 19:08:30.242371 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:30.242380 | orchestrator | Tuesday 23 September 2025 19:08:24 +0000 (0:00:00.208) 0:00:08.948 ***** 2025-09-23 19:08:30.242390 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242400 | orchestrator | 2025-09-23 19:08:30.242410 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:30.242419 | orchestrator | Tuesday 23 September 2025 19:08:24 +0000 (0:00:00.199) 0:00:09.147 ***** 2025-09-23 19:08:30.242429 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242439 | orchestrator | 2025-09-23 19:08:30.242449 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:30.242459 | orchestrator | Tuesday 23 September 2025 19:08:24 +0000 (0:00:00.199) 
0:00:09.347 ***** 2025-09-23 19:08:30.242468 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242478 | orchestrator | 2025-09-23 19:08:30.242488 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-09-23 19:08:30.242497 | orchestrator | Tuesday 23 September 2025 19:08:24 +0000 (0:00:00.206) 0:00:09.553 ***** 2025-09-23 19:08:30.242507 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': None}) 2025-09-23 19:08:30.242517 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': None}) 2025-09-23 19:08:30.242527 | orchestrator | 2025-09-23 19:08:30.242536 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-09-23 19:08:30.242546 | orchestrator | Tuesday 23 September 2025 19:08:24 +0000 (0:00:00.166) 0:00:09.720 ***** 2025-09-23 19:08:30.242556 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242568 | orchestrator | 2025-09-23 19:08:30.242579 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-09-23 19:08:30.242590 | orchestrator | Tuesday 23 September 2025 19:08:25 +0000 (0:00:00.135) 0:00:09.856 ***** 2025-09-23 19:08:30.242601 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242612 | orchestrator | 2025-09-23 19:08:30.242623 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-09-23 19:08:30.242634 | orchestrator | Tuesday 23 September 2025 19:08:25 +0000 (0:00:00.135) 0:00:09.991 ***** 2025-09-23 19:08:30.242645 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242681 | orchestrator | 2025-09-23 19:08:30.242693 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-09-23 19:08:30.242705 | orchestrator | Tuesday 23 September 2025 19:08:25 +0000 (0:00:00.132) 0:00:10.124 ***** 2025-09-23 19:08:30.242716 | orchestrator | ok: 
[testbed-node-3] 2025-09-23 19:08:30.242727 | orchestrator | 2025-09-23 19:08:30.242738 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-09-23 19:08:30.242748 | orchestrator | Tuesday 23 September 2025 19:08:25 +0000 (0:00:00.144) 0:00:10.269 ***** 2025-09-23 19:08:30.242761 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'}}) 2025-09-23 19:08:30.242773 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'ad3a695b-9edf-562e-89c9-18fadd13d262'}}) 2025-09-23 19:08:30.242784 | orchestrator | 2025-09-23 19:08:30.242795 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-09-23 19:08:30.242806 | orchestrator | Tuesday 23 September 2025 19:08:25 +0000 (0:00:00.175) 0:00:10.444 ***** 2025-09-23 19:08:30.242818 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'}})  2025-09-23 19:08:30.242836 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'ad3a695b-9edf-562e-89c9-18fadd13d262'}})  2025-09-23 19:08:30.242848 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242859 | orchestrator | 2025-09-23 19:08:30.242871 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-09-23 19:08:30.242882 | orchestrator | Tuesday 23 September 2025 19:08:25 +0000 (0:00:00.142) 0:00:10.586 ***** 2025-09-23 19:08:30.242893 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'}})  2025-09-23 19:08:30.242904 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'ad3a695b-9edf-562e-89c9-18fadd13d262'}})  2025-09-23 19:08:30.242915 | orchestrator | skipping: [testbed-node-3] 2025-09-23 
19:08:30.242925 | orchestrator | 2025-09-23 19:08:30.242937 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-09-23 19:08:30.242948 | orchestrator | Tuesday 23 September 2025 19:08:26 +0000 (0:00:00.350) 0:00:10.937 ***** 2025-09-23 19:08:30.242959 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'}})  2025-09-23 19:08:30.242969 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'ad3a695b-9edf-562e-89c9-18fadd13d262'}})  2025-09-23 19:08:30.242979 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.242988 | orchestrator | 2025-09-23 19:08:30.243014 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-09-23 19:08:30.243024 | orchestrator | Tuesday 23 September 2025 19:08:26 +0000 (0:00:00.160) 0:00:11.098 ***** 2025-09-23 19:08:30.243034 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:08:30.243043 | orchestrator | 2025-09-23 19:08:30.243053 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-09-23 19:08:30.243062 | orchestrator | Tuesday 23 September 2025 19:08:26 +0000 (0:00:00.150) 0:00:11.248 ***** 2025-09-23 19:08:30.243072 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:08:30.243081 | orchestrator | 2025-09-23 19:08:30.243091 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-09-23 19:08:30.243100 | orchestrator | Tuesday 23 September 2025 19:08:26 +0000 (0:00:00.141) 0:00:11.389 ***** 2025-09-23 19:08:30.243110 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.243119 | orchestrator | 2025-09-23 19:08:30.243129 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-09-23 19:08:30.243138 | orchestrator | Tuesday 23 September 2025 19:08:26 +0000 
(0:00:00.147) 0:00:11.537 ***** 2025-09-23 19:08:30.243148 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.243157 | orchestrator | 2025-09-23 19:08:30.243193 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-09-23 19:08:30.243204 | orchestrator | Tuesday 23 September 2025 19:08:26 +0000 (0:00:00.139) 0:00:11.676 ***** 2025-09-23 19:08:30.243213 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.243223 | orchestrator | 2025-09-23 19:08:30.243232 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-09-23 19:08:30.243242 | orchestrator | Tuesday 23 September 2025 19:08:27 +0000 (0:00:00.136) 0:00:11.813 ***** 2025-09-23 19:08:30.243251 | orchestrator | ok: [testbed-node-3] => { 2025-09-23 19:08:30.243261 | orchestrator |  "ceph_osd_devices": { 2025-09-23 19:08:30.243271 | orchestrator |  "sdb": { 2025-09-23 19:08:30.243281 | orchestrator |  "osd_lvm_uuid": "ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e" 2025-09-23 19:08:30.243291 | orchestrator |  }, 2025-09-23 19:08:30.243301 | orchestrator |  "sdc": { 2025-09-23 19:08:30.243311 | orchestrator |  "osd_lvm_uuid": "ad3a695b-9edf-562e-89c9-18fadd13d262" 2025-09-23 19:08:30.243320 | orchestrator |  } 2025-09-23 19:08:30.243330 | orchestrator |  } 2025-09-23 19:08:30.243340 | orchestrator | } 2025-09-23 19:08:30.243350 | orchestrator | 2025-09-23 19:08:30.243359 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-09-23 19:08:30.243369 | orchestrator | Tuesday 23 September 2025 19:08:27 +0000 (0:00:00.156) 0:00:11.970 ***** 2025-09-23 19:08:30.243378 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:08:30.243388 | orchestrator | 2025-09-23 19:08:30.243398 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-09-23 19:08:30.243407 | orchestrator | Tuesday 23 September 2025 19:08:27 +0000 
(0:00:00.135) 0:00:12.105 *****
skipping: [testbed-node-3]

TASK [Print shared DB/WAL devices] *********************************************
Tuesday 23 September 2025 19:08:27 +0000 (0:00:00.139) 0:00:12.245 *****
skipping: [testbed-node-3]

TASK [Print configuration data] ************************************************
Tuesday 23 September 2025 19:08:27 +0000 (0:00:00.134) 0:00:12.379 *****
changed: [testbed-node-3] => {
    "_ceph_configure_lvm_config_data": {
        "ceph_osd_devices": {
            "sdb": {
                "osd_lvm_uuid": "ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e"
            },
            "sdc": {
                "osd_lvm_uuid": "ad3a695b-9edf-562e-89c9-18fadd13d262"
            }
        },
        "lvm_volumes": [
            {
                "data": "osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e",
                "data_vg": "ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e"
            },
            {
                "data": "osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262",
                "data_vg": "ceph-ad3a695b-9edf-562e-89c9-18fadd13d262"
            }
        ]
    }
}

RUNNING HANDLER [Write configuration file] *************************************
Tuesday 23 September 2025 19:08:27 +0000 (0:00:00.375) 0:00:12.754 *****
changed: [testbed-node-3 -> testbed-manager(192.168.16.5)]

PLAY [Ceph configure LVM] ******************************************************

TASK [Get extra vars for Ceph configuration] ***********************************
Tuesday 23 September 2025 19:08:29 +0000 (0:00:01.752) 0:00:14.506 *****
ok: [testbed-node-4 -> testbed-manager(192.168.16.5)]

TASK [Get initial list of available block devices] *****************************
Tuesday 23 September 2025 19:08:30 +0000 (0:00:00.250) 0:00:14.756 *****
ok: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:30 +0000 (0:00:00.233) 0:00:14.989 *****
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd)
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:30 +0000 (0:00:00.381) 0:00:15.371 *****
skipping: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:30 +0000 (0:00:00.203) 0:00:15.575 *****
skipping: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:31 +0000 (0:00:00.196) 0:00:15.771 *****
skipping: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:31 +0000 (0:00:00.195) 0:00:15.967 *****
skipping: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:31 +0000 (0:00:00.177) 0:00:16.145 *****
skipping: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:31 +0000 (0:00:00.573) 0:00:16.718 *****
skipping: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:32 +0000 (0:00:00.198) 0:00:16.916 *****
skipping: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:32 +0000 (0:00:00.184) 0:00:17.101 *****
skipping: [testbed-node-4]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:32 +0000 (0:00:00.207) 0:00:17.309 *****
ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f)
ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:32 +0000 (0:00:00.406) 0:00:17.715 *****
ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85)
ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:33 +0000 (0:00:00.414) 0:00:18.129 *****
ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd)
ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:33 +0000 (0:00:00.426) 0:00:18.556 *****
ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8)
ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:34 +0000 (0:00:00.451) 0:00:19.007 *****
ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001)

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:34 +0000 (0:00:00.331) 0:00:19.338 *****
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0)

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:34 +0000 (0:00:00.383) 0:00:19.722 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:35 +0000 (0:00:00.203) 0:00:19.925 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:35 +0000 (0:00:00.583) 0:00:20.509 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:35 +0000 (0:00:00.195) 0:00:20.704 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:36 +0000 (0:00:00.194) 0:00:20.898 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:36 +0000 (0:00:00.217) 0:00:21.116 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:36 +0000 (0:00:00.276) 0:00:21.392 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:36 +0000 (0:00:00.208) 0:00:21.601 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:37 +0000 (0:00:00.225) 0:00:21.826 *****
ok: [testbed-node-4] => (item=sda1)
ok: [testbed-node-4] => (item=sda14)
ok: [testbed-node-4] => (item=sda15)
ok: [testbed-node-4] => (item=sda16)

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:37 +0000 (0:00:00.816) 0:00:22.642 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:38 +0000 (0:00:00.192) 0:00:22.835 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:38 +0000 (0:00:00.189) 0:00:23.025 *****
skipping: [testbed-node-4]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:38 +0000 (0:00:00.188) 0:00:23.213 *****
skipping: [testbed-node-4]

TASK [Set UUIDs for OSD VGs/LVs] ***********************************************
Tuesday 23 September 2025 19:08:38 +0000 (0:00:00.194) 0:00:23.408 *****
ok: [testbed-node-4] => (item={'key': 'sdb', 'value': None})
ok: [testbed-node-4] => (item={'key': 'sdc', 'value': None})

TASK [Generate WAL VG names] ***************************************************
Tuesday 23 September 2025 19:08:39 +0000 (0:00:00.349) 0:00:23.757 *****
skipping: [testbed-node-4]

TASK [Generate DB VG names] ****************************************************
Tuesday 23 September 2025 19:08:39 +0000 (0:00:00.138) 0:00:23.896 *****
skipping: [testbed-node-4]

TASK [Generate shared DB/WAL VG names] *****************************************
Tuesday 23 September 2025 19:08:39 +0000 (0:00:00.137) 0:00:24.034 *****
skipping: [testbed-node-4]

TASK [Define lvm_volumes structures] *******************************************
Tuesday 23 September 2025 19:08:39 +0000 (0:00:00.134) 0:00:24.168 *****
ok: [testbed-node-4]

TASK [Generate lvm_volumes structure (block only)] *****************************
Tuesday 23 September 2025 19:08:39 +0000 (0:00:00.136) 0:00:24.304 *****
ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1c8984fd-f811-541c-8648-d34ada8a5304'}})
ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '8028f60e-1a44-5536-9db2-40f94e230aee'}})

TASK [Generate lvm_volumes structure (block + db)] *****************************
Tuesday 23 September 2025 19:08:39 +0000 (0:00:00.160) 0:00:24.464 *****
skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1c8984fd-f811-541c-8648-d34ada8a5304'}})
skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '8028f60e-1a44-5536-9db2-40f94e230aee'}})
skipping: [testbed-node-4]

TASK [Generate lvm_volumes structure (block + wal)] ****************************
Tuesday 23 September 2025 19:08:39 +0000 (0:00:00.135) 0:00:24.600 *****
skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1c8984fd-f811-541c-8648-d34ada8a5304'}})
skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '8028f60e-1a44-5536-9db2-40f94e230aee'}})
skipping: [testbed-node-4]

TASK [Generate lvm_volumes structure (block + db + wal)] ***********************
Tuesday 23 September 2025 19:08:39 +0000 (0:00:00.146) 0:00:24.746 *****
skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1c8984fd-f811-541c-8648-d34ada8a5304'}})
skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '8028f60e-1a44-5536-9db2-40f94e230aee'}})
skipping: [testbed-node-4]

TASK [Compile lvm_volumes] *****************************************************
Tuesday 23 September 2025 19:08:40 +0000 (0:00:00.151) 0:00:24.898 *****
ok: [testbed-node-4]

TASK [Set OSD devices config data] *********************************************
Tuesday 23 September 2025 19:08:40 +0000 (0:00:00.122) 0:00:25.020 *****
ok: [testbed-node-4]

TASK [Set DB devices config data] **********************************************
Tuesday 23 September 2025 19:08:40 +0000 (0:00:00.127) 0:00:25.148 *****
skipping: [testbed-node-4]

TASK [Set WAL devices config data] *********************************************
Tuesday 23 September 2025 19:08:40 +0000 (0:00:00.134) 0:00:25.282 *****
skipping: [testbed-node-4]

TASK [Set DB+WAL devices config data] ******************************************
Tuesday 23 September 2025 19:08:40 +0000 (0:00:00.334) 0:00:25.617 *****
skipping: [testbed-node-4]

TASK [Print ceph_osd_devices] **************************************************
Tuesday 23 September 2025 19:08:40 +0000 (0:00:00.128) 0:00:25.746 *****
ok: [testbed-node-4] => {
    "ceph_osd_devices": {
        "sdb": {
            "osd_lvm_uuid": "1c8984fd-f811-541c-8648-d34ada8a5304"
        },
        "sdc": {
            "osd_lvm_uuid": "8028f60e-1a44-5536-9db2-40f94e230aee"
        }
    }
}

TASK [Print WAL devices] *******************************************************
Tuesday 23 September 2025 19:08:41 +0000 (0:00:00.134) 0:00:25.880 *****
skipping: [testbed-node-4]

TASK [Print DB devices] ********************************************************
Tuesday 23 September 2025 19:08:41 +0000 (0:00:00.128) 0:00:26.008 *****
skipping: [testbed-node-4]

TASK [Print shared DB/WAL devices] *********************************************
Tuesday 23 September 2025 19:08:41 +0000 (0:00:00.126) 0:00:26.134 *****
skipping: [testbed-node-4]

TASK [Print configuration data] ************************************************
Tuesday 23 September 2025 19:08:41 +0000 (0:00:00.139) 0:00:26.274 *****
changed: [testbed-node-4] => {
    "_ceph_configure_lvm_config_data": {
        "ceph_osd_devices": {
            "sdb": {
                "osd_lvm_uuid": "1c8984fd-f811-541c-8648-d34ada8a5304"
            },
            "sdc": {
                "osd_lvm_uuid": "8028f60e-1a44-5536-9db2-40f94e230aee"
            }
        },
        "lvm_volumes": [
            {
                "data": "osd-block-1c8984fd-f811-541c-8648-d34ada8a5304",
                "data_vg": "ceph-1c8984fd-f811-541c-8648-d34ada8a5304"
            },
            {
                "data": "osd-block-8028f60e-1a44-5536-9db2-40f94e230aee",
                "data_vg": "ceph-8028f60e-1a44-5536-9db2-40f94e230aee"
            }
        ]
    }
}

RUNNING HANDLER [Write configuration file] *************************************
Tuesday 23 September 2025 19:08:41 +0000 (0:00:00.202) 0:00:26.476 *****
changed: [testbed-node-4 -> testbed-manager(192.168.16.5)]

PLAY [Ceph configure LVM] ******************************************************

TASK [Get extra vars for Ceph configuration] ***********************************
Tuesday 23 September 2025 19:08:42 +0000 (0:00:00.915) 0:00:27.391 *****
ok: [testbed-node-5 -> testbed-manager(192.168.16.5)]

TASK [Get initial list of available block devices] *****************************
Tuesday 23 September 2025 19:08:42 +0000 (0:00:00.358) 0:00:27.750 *****
ok: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:43 +0000 (0:00:00.515) 0:00:28.265 *****
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd)
included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:43 +0000 (0:00:00.286) 0:00:28.552 *****
skipping: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:43 +0000 (0:00:00.176) 0:00:28.728 *****
skipping: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:44 +0000 (0:00:00.168) 0:00:28.897 *****
skipping: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:44 +0000 (0:00:00.168) 0:00:29.066 *****
skipping: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:44 +0000 (0:00:00.143) 0:00:29.210 *****
skipping: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:44 +0000 (0:00:00.163) 0:00:29.373 *****
skipping: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:44 +0000 (0:00:00.138) 0:00:29.511 *****
skipping: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:44 +0000 (0:00:00.142) 0:00:29.654 *****
skipping: [testbed-node-5]

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:45 +0000 (0:00:00.133) 0:00:29.787 *****
ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705)
ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:45 +0000 (0:00:00.452) 0:00:30.239 *****
ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b)
ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:46 +0000 (0:00:00.561) 0:00:30.801 *****
ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052)
ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:46 +0000 (0:00:00.392) 0:00:31.193 *****
ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e)
ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e)

TASK [Add known links to the list of available block devices] ******************
Tuesday 23 September 2025 19:08:46 +0000 (0:00:00.377) 0:00:31.570 *****
ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001)

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:47 +0000 (0:00:00.297) 0:00:31.867 *****
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd)
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0)

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:47 +0000 (0:00:00.331) 0:00:32.199 *****
skipping: [testbed-node-5]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:47 +0000 (0:00:00.183) 0:00:32.382 *****
skipping: [testbed-node-5]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:47 +0000 (0:00:00.188) 0:00:32.570 *****
skipping: [testbed-node-5]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:48 +0000 (0:00:00.190) 0:00:32.761 *****
skipping: [testbed-node-5]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:48 +0000 (0:00:00.205) 0:00:32.966 *****
skipping: [testbed-node-5]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:48 +0000 (0:00:00.188) 0:00:33.154 *****
skipping: [testbed-node-5]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:48 +0000 (0:00:00.511) 0:00:33.665 *****
skipping: [testbed-node-5]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:49 +0000 (0:00:00.219) 0:00:33.885 *****
skipping: [testbed-node-5]

TASK [Add known partitions to the list of available block devices] *************
Tuesday 23 September 2025 19:08:49 +0000 (0:00:00.193) 0:00:34.078 *****
ok: [testbed-node-5] => (item=sda1)
ok: [testbed-node-5] => (item=sda14) 2025-09-23 19:08:50.748669 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2025-09-23 19:08:50.748679 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2025-09-23 19:08:50.748688 | orchestrator | 2025-09-23 19:08:50.748698 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:50.748707 | orchestrator | Tuesday 23 September 2025 19:08:50 +0000 (0:00:00.685) 0:00:34.763 ***** 2025-09-23 19:08:50.748717 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:50.748726 | orchestrator | 2025-09-23 19:08:50.748735 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:50.748752 | orchestrator | Tuesday 23 September 2025 19:08:50 +0000 (0:00:00.200) 0:00:34.964 ***** 2025-09-23 19:08:50.748761 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:50.748775 | orchestrator | 2025-09-23 19:08:50.748792 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:50.748809 | orchestrator | Tuesday 23 September 2025 19:08:50 +0000 (0:00:00.179) 0:00:35.143 ***** 2025-09-23 19:08:50.748826 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:50.748837 | orchestrator | 2025-09-23 19:08:50.748846 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:08:50.748856 | orchestrator | Tuesday 23 September 2025 19:08:50 +0000 (0:00:00.176) 0:00:35.320 ***** 2025-09-23 19:08:50.748871 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:50.748881 | orchestrator | 2025-09-23 19:08:50.748891 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-09-23 19:08:50.748907 | orchestrator | Tuesday 23 September 2025 19:08:50 +0000 (0:00:00.175) 0:00:35.495 ***** 2025-09-23 19:08:54.744681 | orchestrator | ok: [testbed-node-5] => (item={'key': 
'sdb', 'value': None}) 2025-09-23 19:08:54.744781 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': None}) 2025-09-23 19:08:54.744796 | orchestrator | 2025-09-23 19:08:54.744809 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-09-23 19:08:54.744820 | orchestrator | Tuesday 23 September 2025 19:08:50 +0000 (0:00:00.161) 0:00:35.657 ***** 2025-09-23 19:08:54.744831 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:54.744843 | orchestrator | 2025-09-23 19:08:54.744853 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-09-23 19:08:54.744864 | orchestrator | Tuesday 23 September 2025 19:08:51 +0000 (0:00:00.133) 0:00:35.791 ***** 2025-09-23 19:08:54.744875 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:54.744886 | orchestrator | 2025-09-23 19:08:54.744896 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-09-23 19:08:54.744907 | orchestrator | Tuesday 23 September 2025 19:08:51 +0000 (0:00:00.130) 0:00:35.921 ***** 2025-09-23 19:08:54.744918 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:08:54.744928 | orchestrator | 2025-09-23 19:08:54.744939 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-09-23 19:08:54.744949 | orchestrator | Tuesday 23 September 2025 19:08:51 +0000 (0:00:00.121) 0:00:36.043 ***** 2025-09-23 19:08:54.744960 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:08:54.744971 | orchestrator | 2025-09-23 19:08:54.744982 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-09-23 19:08:54.744993 | orchestrator | Tuesday 23 September 2025 19:08:51 +0000 (0:00:00.259) 0:00:36.302 ***** 2025-09-23 19:08:54.745005 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'}}) 
2025-09-23 19:08:54.745016 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a2ccb3fa-3e8c-5172-95cb-7cae39233d42'}})
2025-09-23 19:08:54.745027 | orchestrator |
2025-09-23 19:08:54.745038 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] *****************************
2025-09-23 19:08:54.745049 | orchestrator | Tuesday 23 September 2025 19:08:51 +0000 (0:00:00.139) 0:00:36.442 *****
2025-09-23 19:08:54.745060 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'}})
2025-09-23 19:08:54.745072 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a2ccb3fa-3e8c-5172-95cb-7cae39233d42'}})
2025-09-23 19:08:54.745083 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745094 | orchestrator |
2025-09-23 19:08:54.745122 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] ****************************
2025-09-23 19:08:54.745134 | orchestrator | Tuesday 23 September 2025 19:08:51 +0000 (0:00:00.155) 0:00:36.581 *****
2025-09-23 19:08:54.745144 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'}})
2025-09-23 19:08:54.745274 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a2ccb3fa-3e8c-5172-95cb-7cae39233d42'}})
2025-09-23 19:08:54.745289 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745301 | orchestrator |
2025-09-23 19:08:54.745314 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] ***********************
2025-09-23 19:08:54.745326 | orchestrator | Tuesday 23 September 2025 19:08:51 +0000 (0:00:00.168) 0:00:36.737 *****
2025-09-23 19:08:54.745338 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'}})
2025-09-23 19:08:54.745350 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a2ccb3fa-3e8c-5172-95cb-7cae39233d42'}})
2025-09-23 19:08:54.745362 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745374 | orchestrator |
2025-09-23 19:08:54.745387 | orchestrator | TASK [Compile lvm_volumes] *****************************************************
2025-09-23 19:08:54.745399 | orchestrator | Tuesday 23 September 2025 19:08:52 +0000 (0:00:00.168) 0:00:36.905 *****
2025-09-23 19:08:54.745412 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:08:54.745424 | orchestrator |
2025-09-23 19:08:54.745436 | orchestrator | TASK [Set OSD devices config data] *********************************************
2025-09-23 19:08:54.745449 | orchestrator | Tuesday 23 September 2025 19:08:52 +0000 (0:00:00.138) 0:00:37.044 *****
2025-09-23 19:08:54.745461 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:08:54.745473 | orchestrator |
2025-09-23 19:08:54.745485 | orchestrator | TASK [Set DB devices config data] **********************************************
2025-09-23 19:08:54.745497 | orchestrator | Tuesday 23 September 2025 19:08:52 +0000 (0:00:00.134) 0:00:37.178 *****
2025-09-23 19:08:54.745510 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745522 | orchestrator |
2025-09-23 19:08:54.745534 | orchestrator | TASK [Set WAL devices config data] *********************************************
2025-09-23 19:08:54.745545 | orchestrator | Tuesday 23 September 2025 19:08:52 +0000 (0:00:00.139) 0:00:37.317 *****
2025-09-23 19:08:54.745558 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745570 | orchestrator |
2025-09-23 19:08:54.745581 | orchestrator | TASK [Set DB+WAL devices config data] ******************************************
2025-09-23 19:08:54.745592 | orchestrator | Tuesday 23 September 2025 19:08:52 +0000 (0:00:00.149) 0:00:37.466 *****
2025-09-23 19:08:54.745602 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745613 | orchestrator |
2025-09-23 19:08:54.745623 | orchestrator | TASK [Print ceph_osd_devices] **************************************************
2025-09-23 19:08:54.745634 | orchestrator | Tuesday 23 September 2025 19:08:52 +0000 (0:00:00.136) 0:00:37.603 *****
2025-09-23 19:08:54.745644 | orchestrator | ok: [testbed-node-5] => {
2025-09-23 19:08:54.745655 | orchestrator |     "ceph_osd_devices": {
2025-09-23 19:08:54.745666 | orchestrator |         "sdb": {
2025-09-23 19:08:54.745677 | orchestrator |             "osd_lvm_uuid": "ecd11808-f35b-5e5a-be1d-5423ee6ce3c5"
2025-09-23 19:08:54.745707 | orchestrator |         },
2025-09-23 19:08:54.745719 | orchestrator |         "sdc": {
2025-09-23 19:08:54.745730 | orchestrator |             "osd_lvm_uuid": "a2ccb3fa-3e8c-5172-95cb-7cae39233d42"
2025-09-23 19:08:54.745740 | orchestrator |         }
2025-09-23 19:08:54.745751 | orchestrator |     }
2025-09-23 19:08:54.745763 | orchestrator | }
2025-09-23 19:08:54.745774 | orchestrator |
2025-09-23 19:08:54.745785 | orchestrator | TASK [Print WAL devices] *******************************************************
2025-09-23 19:08:54.745795 | orchestrator | Tuesday 23 September 2025 19:08:52 +0000 (0:00:00.137) 0:00:37.741 *****
2025-09-23 19:08:54.745806 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745817 | orchestrator |
2025-09-23 19:08:54.745827 | orchestrator | TASK [Print DB devices] ********************************************************
2025-09-23 19:08:54.745838 | orchestrator | Tuesday 23 September 2025 19:08:53 +0000 (0:00:00.123) 0:00:37.864 *****
2025-09-23 19:08:54.745848 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745859 | orchestrator |
2025-09-23 19:08:54.745870 | orchestrator | TASK [Print shared DB/WAL devices] *********************************************
2025-09-23 19:08:54.745888 | orchestrator | Tuesday 23 September 2025 19:08:53 +0000 (0:00:00.327) 0:00:38.192 *****
2025-09-23 19:08:54.745899 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:08:54.745910 | orchestrator |
2025-09-23 19:08:54.745920 | orchestrator | TASK [Print configuration data] ************************************************
2025-09-23 19:08:54.745931 | orchestrator | Tuesday 23 September 2025 19:08:53 +0000 (0:00:00.121) 0:00:38.314 *****
2025-09-23 19:08:54.745941 | orchestrator | changed: [testbed-node-5] => {
2025-09-23 19:08:54.745952 | orchestrator |     "_ceph_configure_lvm_config_data": {
2025-09-23 19:08:54.745963 | orchestrator |         "ceph_osd_devices": {
2025-09-23 19:08:54.745974 | orchestrator |             "sdb": {
2025-09-23 19:08:54.745984 | orchestrator |                 "osd_lvm_uuid": "ecd11808-f35b-5e5a-be1d-5423ee6ce3c5"
2025-09-23 19:08:54.745995 | orchestrator |             },
2025-09-23 19:08:54.746006 | orchestrator |             "sdc": {
2025-09-23 19:08:54.746124 | orchestrator |                 "osd_lvm_uuid": "a2ccb3fa-3e8c-5172-95cb-7cae39233d42"
2025-09-23 19:08:54.746155 | orchestrator |             }
2025-09-23 19:08:54.746213 | orchestrator |         },
2025-09-23 19:08:54.746225 | orchestrator |         "lvm_volumes": [
2025-09-23 19:08:54.746236 | orchestrator |             {
2025-09-23 19:08:54.746247 | orchestrator |                 "data": "osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5",
2025-09-23 19:08:54.746258 | orchestrator |                 "data_vg": "ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5"
2025-09-23 19:08:54.746269 | orchestrator |             },
2025-09-23 19:08:54.746279 | orchestrator |             {
2025-09-23 19:08:54.746290 | orchestrator |                 "data": "osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42",
2025-09-23 19:08:54.746301 | orchestrator |                 "data_vg": "ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42"
2025-09-23 19:08:54.746312 | orchestrator |             }
2025-09-23 19:08:54.746323 | orchestrator |         ]
2025-09-23 19:08:54.746334 | orchestrator |     }
2025-09-23 19:08:54.746349 | orchestrator | }
2025-09-23 19:08:54.746360 | orchestrator |
2025-09-23 19:08:54.746371 | orchestrator | RUNNING HANDLER [Write configuration file] *************************************
2025-09-23 19:08:54.746382 | orchestrator | Tuesday 23 September 2025 19:08:53 +0000 (0:00:00.202) 0:00:38.517 *****
2025-09-23 19:08:54.746392 | orchestrator | changed: [testbed-node-5 -> testbed-manager(192.168.16.5)]
2025-09-23 19:08:54.746403 | orchestrator |
2025-09-23 19:08:54.746414 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:08:54.746436 | orchestrator | testbed-node-3 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0
2025-09-23 19:08:54.746449 | orchestrator | testbed-node-4 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0
2025-09-23 19:08:54.746460 | orchestrator | testbed-node-5 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0
2025-09-23 19:08:54.746471 | orchestrator |
2025-09-23 19:08:54.746482 | orchestrator |
2025-09-23 19:08:54.746492 | orchestrator |
2025-09-23 19:08:54.746503 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:08:54.746514 | orchestrator | Tuesday 23 September 2025 19:08:54 +0000 (0:00:00.959) 0:00:39.477 *****
2025-09-23 19:08:54.746525 | orchestrator | ===============================================================================
2025-09-23 19:08:54.746535 | orchestrator | Write configuration file ------------------------------------------------ 3.63s
2025-09-23 19:08:54.746546 | orchestrator | Add known partitions to the list of available block devices ------------- 1.10s
2025-09-23 19:08:54.746557 | orchestrator | Add known partitions to the list of available block devices ------------- 1.07s
2025-09-23 19:08:54.746567 | orchestrator | Add known links to the list of available block devices ------------------ 1.04s
2025-09-23 19:08:54.746578 | orchestrator | Get initial list of available block devices ----------------------------- 0.98s
2025-09-23 19:08:54.746599 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.85s
2025-09-23 19:08:54.746610 | orchestrator | Add known partitions to the list of available block devices ------------- 0.82s
2025-09-23 19:08:54.746621 | orchestrator | Print configuration data ------------------------------------------------ 0.78s
2025-09-23 19:08:54.746631 | orchestrator | Add known partitions to the list of available block devices ------------- 0.69s
2025-09-23 19:08:54.746642 | orchestrator | Add known links to the list of available block devices ------------------ 0.69s
2025-09-23 19:08:54.746653 | orchestrator | Set UUIDs for OSD VGs/LVs ----------------------------------------------- 0.68s
2025-09-23 19:08:54.746663 | orchestrator | Generate lvm_volumes structure (block + wal) ---------------------------- 0.65s
2025-09-23 19:08:54.746674 | orchestrator | Set WAL devices config data --------------------------------------------- 0.62s
2025-09-23 19:08:54.746685 | orchestrator | Print DB devices -------------------------------------------------------- 0.59s
2025-09-23 19:08:54.746707 | orchestrator | Add known links to the list of available block devices ------------------ 0.59s
2025-09-23 19:08:55.089609 | orchestrator | Add known links to the list of available block devices ------------------ 0.59s
2025-09-23 19:08:55.089696 | orchestrator | Add known partitions to the list of available block devices ------------- 0.58s
2025-09-23 19:08:55.089705 | orchestrator | Add known links to the list of available block devices ------------------ 0.57s
2025-09-23 19:08:55.089712 | orchestrator | Add known links to the list of available block devices ------------------ 0.56s
2025-09-23 19:08:55.089719 | orchestrator | Define lvm_volumes structures ------------------------------------------- 0.54s
2025-09-23 19:09:17.578277 | orchestrator | 2025-09-23 19:09:17 | INFO  | Task 2d2672ee-3d74-4e6d-95da-3c498ccc91d4 (sync inventory) is running in background. Output coming soon.
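The "Print configuration data" output above shows the convention the play applies: each entry in `ceph_osd_devices` carries an `osd_lvm_uuid`, and the compiled `lvm_volumes` list names the logical volume `osd-block-<uuid>` inside a volume group `ceph-<uuid>`. A minimal sketch of that mapping, using the UUIDs from the log; this is an illustration of the naming pattern only, not the actual OSISM Ansible task logic:

```python
# Reproduce the block-only lvm_volumes naming convention seen in the log:
# osd_lvm_uuid U  ->  LV "osd-block-U" in VG "ceph-U".
ceph_osd_devices = {
    "sdb": {"osd_lvm_uuid": "ecd11808-f35b-5e5a-be1d-5423ee6ce3c5"},
    "sdc": {"osd_lvm_uuid": "a2ccb3fa-3e8c-5172-95cb-7cae39233d42"},
}

def build_lvm_volumes(devices: dict) -> list:
    """Build a ceph-ansible style lvm_volumes list (block-only layout)."""
    return [
        {
            "data": f"osd-block-{cfg['osd_lvm_uuid']}",
            "data_vg": f"ceph-{cfg['osd_lvm_uuid']}",
        }
        for cfg in devices.values()
    ]

print(build_lvm_volumes(ceph_osd_devices))
```

With separate DB or WAL devices (the `block + db` / `block + wal` branches skipped in this run), the entries would additionally carry `db`/`db_vg` or `wal`/`wal_vg` keys.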
2025-09-23 19:09:41.264458 | orchestrator | 2025-09-23 19:09:18 | INFO  | Starting group_vars file reorganization
2025-09-23 19:09:41.264570 | orchestrator | 2025-09-23 19:09:18 | INFO  | Moved 0 file(s) to their respective directories
2025-09-23 19:09:41.264587 | orchestrator | 2025-09-23 19:09:18 | INFO  | Group_vars file reorganization completed
2025-09-23 19:09:41.264600 | orchestrator | 2025-09-23 19:09:20 | INFO  | Starting variable preparation from inventory
2025-09-23 19:09:41.264611 | orchestrator | 2025-09-23 19:09:23 | INFO  | Writing 050-kolla-ceph-rgw-hosts.yml with ceph_rgw_hosts
2025-09-23 19:09:41.264623 | orchestrator | 2025-09-23 19:09:23 | INFO  | Writing 050-infrastructure-cephclient-mons.yml with cephclient_mons
2025-09-23 19:09:41.264634 | orchestrator | 2025-09-23 19:09:23 | INFO  | Writing 050-ceph-cluster-fsid.yml with ceph_cluster_fsid
2025-09-23 19:09:41.264645 | orchestrator | 2025-09-23 19:09:23 | INFO  | 3 file(s) written, 6 host(s) processed
2025-09-23 19:09:41.264656 | orchestrator | 2025-09-23 19:09:23 | INFO  | Variable preparation completed
2025-09-23 19:09:41.264667 | orchestrator | 2025-09-23 19:09:24 | INFO  | Starting inventory overwrite handling
2025-09-23 19:09:41.264678 | orchestrator | 2025-09-23 19:09:24 | INFO  | Handling group overwrites in 99-overwrite
2025-09-23 19:09:41.264690 | orchestrator | 2025-09-23 19:09:24 | INFO  | Removing group frr:children from 60-generic
2025-09-23 19:09:41.264701 | orchestrator | 2025-09-23 19:09:24 | INFO  | Removing group storage:children from 50-kolla
2025-09-23 19:09:41.264712 | orchestrator | 2025-09-23 19:09:24 | INFO  | Removing group netbird:children from 50-infrastructure
2025-09-23 19:09:41.264723 | orchestrator | 2025-09-23 19:09:24 | INFO  | Removing group ceph-rgw from 50-ceph
2025-09-23 19:09:41.264735 | orchestrator | 2025-09-23 19:09:24 | INFO  | Removing group ceph-mds from 50-ceph
2025-09-23 19:09:41.264746 | orchestrator | 2025-09-23 19:09:24 | INFO  | Handling group overwrites in 20-roles
2025-09-23 19:09:41.264757 | orchestrator | 2025-09-23 19:09:24 | INFO  | Removing group k3s_node from 50-infrastructure
2025-09-23 19:09:41.264793 | orchestrator | 2025-09-23 19:09:24 | INFO  | Removed 6 group(s) in total
2025-09-23 19:09:41.264805 | orchestrator | 2025-09-23 19:09:24 | INFO  | Inventory overwrite handling completed
2025-09-23 19:09:41.264816 | orchestrator | 2025-09-23 19:09:25 | INFO  | Starting merge of inventory files
2025-09-23 19:09:41.264826 | orchestrator | 2025-09-23 19:09:25 | INFO  | Inventory files merged successfully
2025-09-23 19:09:41.264837 | orchestrator | 2025-09-23 19:09:28 | INFO  | Generating ClusterShell configuration from Ansible inventory
2025-09-23 19:09:41.264848 | orchestrator | 2025-09-23 19:09:40 | INFO  | Successfully wrote ClusterShell configuration
2025-09-23 19:09:41.264860 | orchestrator | [master e655079] 2025-09-23-19-09
2025-09-23 19:09:41.264872 | orchestrator | 1 file changed, 30 insertions(+), 9 deletions(-)
2025-09-23 19:09:43.557466 | orchestrator | 2025-09-23 19:09:43 | INFO  | Task 33f5f1c5-879c-4fd8-ad0a-13a80b839030 (ceph-create-lvm-devices) was prepared for execution.
2025-09-23 19:09:43.557581 | orchestrator | 2025-09-23 19:09:43 | INFO  | It takes a moment until task 33f5f1c5-879c-4fd8-ad0a-13a80b839030 (ceph-create-lvm-devices) has been started and output is visible here.
2025-09-23 19:09:54.515661 | orchestrator |
2025-09-23 19:09:54.515776 | orchestrator | PLAY [Ceph create LVM devices] *************************************************
2025-09-23 19:09:54.515792 | orchestrator |
2025-09-23 19:09:54.515804 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2025-09-23 19:09:54.515816 | orchestrator | Tuesday 23 September 2025 19:09:47 +0000 (0:00:00.296) 0:00:00.296 *****
2025-09-23 19:09:54.515827 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)]
2025-09-23 19:09:54.515838 | orchestrator |
2025-09-23 19:09:54.515849 | orchestrator | TASK [Get initial list of available block devices] *****************************
2025-09-23 19:09:54.515860 | orchestrator | Tuesday 23 September 2025 19:09:47 +0000 (0:00:00.214) 0:00:00.511 *****
2025-09-23 19:09:54.515871 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:09:54.515883 | orchestrator |
2025-09-23 19:09:54.515894 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.515905 | orchestrator | Tuesday 23 September 2025 19:09:47 +0000 (0:00:00.214) 0:00:00.726 *****
2025-09-23 19:09:54.515916 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0)
2025-09-23 19:09:54.515928 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1)
2025-09-23 19:09:54.515939 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2)
2025-09-23 19:09:54.515950 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3)
2025-09-23 19:09:54.515960 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4)
2025-09-23 19:09:54.515971 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5)
2025-09-23 19:09:54.515982 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6)
2025-09-23 19:09:54.515993 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7)
2025-09-23 19:09:54.516003 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda)
2025-09-23 19:09:54.516014 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb)
2025-09-23 19:09:54.516025 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc)
2025-09-23 19:09:54.516035 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd)
2025-09-23 19:09:54.516046 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0)
2025-09-23 19:09:54.516057 | orchestrator |
2025-09-23 19:09:54.516067 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516103 | orchestrator | Tuesday 23 September 2025 19:09:48 +0000 (0:00:00.375) 0:00:01.101 *****
2025-09-23 19:09:54.516114 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.516125 | orchestrator |
2025-09-23 19:09:54.516173 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516202 | orchestrator | Tuesday 23 September 2025 19:09:48 +0000 (0:00:00.351) 0:00:01.453 *****
2025-09-23 19:09:54.516216 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.516228 | orchestrator |
2025-09-23 19:09:54.516240 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516253 | orchestrator | Tuesday 23 September 2025 19:09:48 +0000 (0:00:00.198) 0:00:01.652 *****
2025-09-23 19:09:54.516273 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.516285 | orchestrator |
2025-09-23 19:09:54.516297 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516310 | orchestrator | Tuesday 23 September 2025 19:09:48 +0000 (0:00:00.173) 0:00:01.826 *****
2025-09-23 19:09:54.516323 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.516343 | orchestrator |
2025-09-23 19:09:54.516368 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516396 | orchestrator | Tuesday 23 September 2025 19:09:49 +0000 (0:00:00.166) 0:00:01.992 *****
2025-09-23 19:09:54.516414 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.516433 | orchestrator |
2025-09-23 19:09:54.516451 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516470 | orchestrator | Tuesday 23 September 2025 19:09:49 +0000 (0:00:00.203) 0:00:02.195 *****
2025-09-23 19:09:54.516488 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.516508 | orchestrator |
2025-09-23 19:09:54.516528 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516548 | orchestrator | Tuesday 23 September 2025 19:09:49 +0000 (0:00:00.196) 0:00:02.391 *****
2025-09-23 19:09:54.516568 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.516582 | orchestrator |
2025-09-23 19:09:54.516592 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516603 | orchestrator | Tuesday 23 September 2025 19:09:49 +0000 (0:00:00.183) 0:00:02.575 *****
2025-09-23 19:09:54.516614 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.516624 | orchestrator |
2025-09-23 19:09:54.516635 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516645 | orchestrator | Tuesday 23 September 2025 19:09:49 +0000 (0:00:00.182) 0:00:02.758 *****
2025-09-23 19:09:54.516656 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37)
2025-09-23 19:09:54.516667 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37)
2025-09-23 19:09:54.516678 | orchestrator |
2025-09-23 19:09:54.516689 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516700 | orchestrator | Tuesday 23 September 2025 19:09:50 +0000 (0:00:00.357) 0:00:03.115 *****
2025-09-23 19:09:54.516730 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5)
2025-09-23 19:09:54.516741 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5)
2025-09-23 19:09:54.516752 | orchestrator |
2025-09-23 19:09:54.516763 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516774 | orchestrator | Tuesday 23 September 2025 19:09:50 +0000 (0:00:00.400) 0:00:03.516 *****
2025-09-23 19:09:54.516785 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd)
2025-09-23 19:09:54.516795 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd)
2025-09-23 19:09:54.516806 | orchestrator |
2025-09-23 19:09:54.516817 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516839 | orchestrator | Tuesday 23 September 2025 19:09:51 +0000 (0:00:00.611) 0:00:04.127 *****
2025-09-23 19:09:54.516850 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6)
2025-09-23 19:09:54.516861 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6)
2025-09-23 19:09:54.516871 | orchestrator |
2025-09-23 19:09:54.516882 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:09:54.516893 | orchestrator | Tuesday 23 September 2025 19:09:52 +0000 (0:00:00.845) 0:00:04.972 *****
2025-09-23 19:09:54.516904 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001)
2025-09-23 19:09:54.516914 | orchestrator |
2025-09-23 19:09:54.516925 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:09:54.516936 | orchestrator | Tuesday 23 September 2025 19:09:52 +0000 (0:00:00.331) 0:00:05.304 *****
2025-09-23 19:09:54.516946 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0)
2025-09-23 19:09:54.516957 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1)
2025-09-23 19:09:54.516967 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2)
2025-09-23 19:09:54.516978 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3)
2025-09-23 19:09:54.516988 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4)
2025-09-23 19:09:54.516999 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5)
2025-09-23 19:09:54.517009 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6)
2025-09-23 19:09:54.517020 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7)
2025-09-23 19:09:54.517031 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda)
2025-09-23 19:09:54.517041 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb)
2025-09-23 19:09:54.517052 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc)
2025-09-23 19:09:54.517062 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd)
2025-09-23 19:09:54.517073 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0)
2025-09-23 19:09:54.517084 | orchestrator |
2025-09-23 19:09:54.517094 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:09:54.517105 | orchestrator | Tuesday 23 September 2025 19:09:52 +0000 (0:00:00.420) 0:00:05.725 *****
2025-09-23 19:09:54.517116 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.517127 | orchestrator |
2025-09-23 19:09:54.517165 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:09:54.517176 | orchestrator | Tuesday 23 September 2025 19:09:53 +0000 (0:00:00.202) 0:00:05.927 *****
2025-09-23 19:09:54.517186 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.517197 | orchestrator |
2025-09-23 19:09:54.517208 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:09:54.517219 | orchestrator | Tuesday 23 September 2025 19:09:53 +0000 (0:00:00.210) 0:00:06.137 *****
2025-09-23 19:09:54.517229 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.517240 | orchestrator |
2025-09-23 19:09:54.517251 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:09:54.517261 | orchestrator | Tuesday 23 September 2025 19:09:53 +0000 (0:00:00.196) 0:00:06.334 *****
2025-09-23 19:09:54.517272 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:09:54.517283 | orchestrator |
2025-09-23 19:09:54.517293 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:09:54.517312 | orchestrator | Tuesday 23 September 2025
19:09:53 +0000 (0:00:00.203) 0:00:06.537 ***** 2025-09-23 19:09:54.517323 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:09:54.517334 | orchestrator | 2025-09-23 19:09:54.517345 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:09:54.517355 | orchestrator | Tuesday 23 September 2025 19:09:53 +0000 (0:00:00.211) 0:00:06.748 ***** 2025-09-23 19:09:54.517366 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:09:54.517377 | orchestrator | 2025-09-23 19:09:54.517387 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:09:54.517398 | orchestrator | Tuesday 23 September 2025 19:09:54 +0000 (0:00:00.205) 0:00:06.953 ***** 2025-09-23 19:09:54.517409 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:09:54.517419 | orchestrator | 2025-09-23 19:09:54.517430 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:09:54.517441 | orchestrator | Tuesday 23 September 2025 19:09:54 +0000 (0:00:00.203) 0:00:07.157 ***** 2025-09-23 19:09:54.517458 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.306794 | orchestrator | 2025-09-23 19:10:02.306868 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:02.306878 | orchestrator | Tuesday 23 September 2025 19:09:54 +0000 (0:00:00.204) 0:00:07.361 ***** 2025-09-23 19:10:02.306886 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2025-09-23 19:10:02.306893 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2025-09-23 19:10:02.306900 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2025-09-23 19:10:02.306907 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2025-09-23 19:10:02.306914 | orchestrator | 2025-09-23 19:10:02.306921 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:02.306927 | 
orchestrator | Tuesday 23 September 2025 19:09:55 +0000 (0:00:01.103) 0:00:08.464 ***** 2025-09-23 19:10:02.306934 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.306941 | orchestrator | 2025-09-23 19:10:02.306948 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:02.306954 | orchestrator | Tuesday 23 September 2025 19:09:55 +0000 (0:00:00.210) 0:00:08.675 ***** 2025-09-23 19:10:02.306961 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.306968 | orchestrator | 2025-09-23 19:10:02.306975 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:02.306981 | orchestrator | Tuesday 23 September 2025 19:09:56 +0000 (0:00:00.189) 0:00:08.865 ***** 2025-09-23 19:10:02.306988 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.306995 | orchestrator | 2025-09-23 19:10:02.307002 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:02.307009 | orchestrator | Tuesday 23 September 2025 19:09:56 +0000 (0:00:00.184) 0:00:09.049 ***** 2025-09-23 19:10:02.307015 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307022 | orchestrator | 2025-09-23 19:10:02.307029 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-09-23 19:10:02.307036 | orchestrator | Tuesday 23 September 2025 19:09:56 +0000 (0:00:00.191) 0:00:09.240 ***** 2025-09-23 19:10:02.307042 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307049 | orchestrator | 2025-09-23 19:10:02.307056 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-09-23 19:10:02.307063 | orchestrator | Tuesday 23 September 2025 19:09:56 +0000 (0:00:00.128) 0:00:09.369 ***** 2025-09-23 19:10:02.307070 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 
'ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'}}) 2025-09-23 19:10:02.307077 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'ad3a695b-9edf-562e-89c9-18fadd13d262'}}) 2025-09-23 19:10:02.307083 | orchestrator | 2025-09-23 19:10:02.307090 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-09-23 19:10:02.307097 | orchestrator | Tuesday 23 September 2025 19:09:56 +0000 (0:00:00.178) 0:00:09.547 ***** 2025-09-23 19:10:02.307117 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'}) 2025-09-23 19:10:02.307124 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'}) 2025-09-23 19:10:02.307162 | orchestrator | 2025-09-23 19:10:02.307180 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-09-23 19:10:02.307190 | orchestrator | Tuesday 23 September 2025 19:09:58 +0000 (0:00:02.028) 0:00:11.576 ***** 2025-09-23 19:10:02.307197 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:02.307205 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:02.307211 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307218 | orchestrator | 2025-09-23 19:10:02.307225 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-09-23 19:10:02.307232 | orchestrator | Tuesday 23 September 2025 19:09:58 +0000 (0:00:00.142) 0:00:11.719 ***** 2025-09-23 19:10:02.307238 | orchestrator | changed: [testbed-node-3] => (item={'data': 
'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'}) 2025-09-23 19:10:02.307245 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'}) 2025-09-23 19:10:02.307252 | orchestrator | 2025-09-23 19:10:02.307258 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-09-23 19:10:02.307265 | orchestrator | Tuesday 23 September 2025 19:10:00 +0000 (0:00:01.471) 0:00:13.190 ***** 2025-09-23 19:10:02.307271 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:02.307279 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:02.307285 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307292 | orchestrator | 2025-09-23 19:10:02.307299 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-09-23 19:10:02.307306 | orchestrator | Tuesday 23 September 2025 19:10:00 +0000 (0:00:00.147) 0:00:13.338 ***** 2025-09-23 19:10:02.307312 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307319 | orchestrator | 2025-09-23 19:10:02.307326 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-09-23 19:10:02.307343 | orchestrator | Tuesday 23 September 2025 19:10:00 +0000 (0:00:00.146) 0:00:13.484 ***** 2025-09-23 19:10:02.307350 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:02.307357 | orchestrator | skipping: [testbed-node-3] => (item={'data': 
'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:02.307364 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307371 | orchestrator | 2025-09-23 19:10:02.307379 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-09-23 19:10:02.307386 | orchestrator | Tuesday 23 September 2025 19:10:00 +0000 (0:00:00.282) 0:00:13.767 ***** 2025-09-23 19:10:02.307394 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307401 | orchestrator | 2025-09-23 19:10:02.307409 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-09-23 19:10:02.307416 | orchestrator | Tuesday 23 September 2025 19:10:01 +0000 (0:00:00.121) 0:00:13.888 ***** 2025-09-23 19:10:02.307424 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:02.307437 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:02.307445 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307453 | orchestrator | 2025-09-23 19:10:02.307460 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-09-23 19:10:02.307468 | orchestrator | Tuesday 23 September 2025 19:10:01 +0000 (0:00:00.143) 0:00:14.032 ***** 2025-09-23 19:10:02.307475 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307482 | orchestrator | 2025-09-23 19:10:02.307490 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-09-23 19:10:02.307497 | orchestrator | Tuesday 23 September 2025 19:10:01 +0000 (0:00:00.143) 0:00:14.175 ***** 2025-09-23 19:10:02.307505 | orchestrator | skipping: 
[testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:02.307513 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:02.307520 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307527 | orchestrator | 2025-09-23 19:10:02.307535 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-09-23 19:10:02.307543 | orchestrator | Tuesday 23 September 2025 19:10:01 +0000 (0:00:00.153) 0:00:14.328 ***** 2025-09-23 19:10:02.307550 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:10:02.307558 | orchestrator | 2025-09-23 19:10:02.307566 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-09-23 19:10:02.307573 | orchestrator | Tuesday 23 September 2025 19:10:01 +0000 (0:00:00.127) 0:00:14.456 ***** 2025-09-23 19:10:02.307584 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:02.307592 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:02.307599 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307607 | orchestrator | 2025-09-23 19:10:02.307615 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-09-23 19:10:02.307623 | orchestrator | Tuesday 23 September 2025 19:10:01 +0000 (0:00:00.136) 0:00:14.593 ***** 2025-09-23 19:10:02.307630 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  
2025-09-23 19:10:02.307637 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:02.307645 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307653 | orchestrator | 2025-09-23 19:10:02.307660 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-09-23 19:10:02.307668 | orchestrator | Tuesday 23 September 2025 19:10:01 +0000 (0:00:00.167) 0:00:14.761 ***** 2025-09-23 19:10:02.307675 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:02.307681 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:02.307688 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307695 | orchestrator | 2025-09-23 19:10:02.307702 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-09-23 19:10:02.307708 | orchestrator | Tuesday 23 September 2025 19:10:02 +0000 (0:00:00.154) 0:00:14.915 ***** 2025-09-23 19:10:02.307715 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307725 | orchestrator | 2025-09-23 19:10:02.307732 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-09-23 19:10:02.307739 | orchestrator | Tuesday 23 September 2025 19:10:02 +0000 (0:00:00.116) 0:00:15.032 ***** 2025-09-23 19:10:02.307746 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:02.307752 | orchestrator | 2025-09-23 19:10:02.307762 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-09-23 19:10:08.457089 | orchestrator | Tuesday 23 September 2025 19:10:02 +0000 
(0:00:00.122) 0:00:15.154 ***** 2025-09-23 19:10:08.457225 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.457253 | orchestrator | 2025-09-23 19:10:08.457269 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-09-23 19:10:08.457280 | orchestrator | Tuesday 23 September 2025 19:10:02 +0000 (0:00:00.135) 0:00:15.290 ***** 2025-09-23 19:10:08.457291 | orchestrator | ok: [testbed-node-3] => { 2025-09-23 19:10:08.457302 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-09-23 19:10:08.457313 | orchestrator | } 2025-09-23 19:10:08.457324 | orchestrator | 2025-09-23 19:10:08.457335 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-09-23 19:10:08.457346 | orchestrator | Tuesday 23 September 2025 19:10:02 +0000 (0:00:00.259) 0:00:15.550 ***** 2025-09-23 19:10:08.457356 | orchestrator | ok: [testbed-node-3] => { 2025-09-23 19:10:08.457367 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-09-23 19:10:08.457378 | orchestrator | } 2025-09-23 19:10:08.457388 | orchestrator | 2025-09-23 19:10:08.457399 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-09-23 19:10:08.457410 | orchestrator | Tuesday 23 September 2025 19:10:02 +0000 (0:00:00.171) 0:00:15.722 ***** 2025-09-23 19:10:08.457421 | orchestrator | ok: [testbed-node-3] => { 2025-09-23 19:10:08.457432 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-09-23 19:10:08.457442 | orchestrator | } 2025-09-23 19:10:08.457454 | orchestrator | 2025-09-23 19:10:08.457465 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-09-23 19:10:08.457476 | orchestrator | Tuesday 23 September 2025 19:10:02 +0000 (0:00:00.131) 0:00:15.854 ***** 2025-09-23 19:10:08.457487 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:10:08.457498 | orchestrator | 2025-09-23 19:10:08.457509 | orchestrator | TASK [Gather 
WAL VGs with total and available size in bytes] ******************* 2025-09-23 19:10:08.457519 | orchestrator | Tuesday 23 September 2025 19:10:03 +0000 (0:00:00.660) 0:00:16.514 ***** 2025-09-23 19:10:08.457530 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:10:08.457541 | orchestrator | 2025-09-23 19:10:08.457551 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-09-23 19:10:08.457562 | orchestrator | Tuesday 23 September 2025 19:10:04 +0000 (0:00:00.496) 0:00:17.011 ***** 2025-09-23 19:10:08.457573 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:10:08.457584 | orchestrator | 2025-09-23 19:10:08.457594 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-09-23 19:10:08.457605 | orchestrator | Tuesday 23 September 2025 19:10:04 +0000 (0:00:00.570) 0:00:17.581 ***** 2025-09-23 19:10:08.457616 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:10:08.457627 | orchestrator | 2025-09-23 19:10:08.457637 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-09-23 19:10:08.457648 | orchestrator | Tuesday 23 September 2025 19:10:04 +0000 (0:00:00.142) 0:00:17.724 ***** 2025-09-23 19:10:08.457659 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.457669 | orchestrator | 2025-09-23 19:10:08.457680 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-09-23 19:10:08.457691 | orchestrator | Tuesday 23 September 2025 19:10:04 +0000 (0:00:00.116) 0:00:17.841 ***** 2025-09-23 19:10:08.457702 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.457712 | orchestrator | 2025-09-23 19:10:08.457723 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-09-23 19:10:08.457734 | orchestrator | Tuesday 23 September 2025 19:10:05 +0000 (0:00:00.118) 0:00:17.960 ***** 2025-09-23 19:10:08.457774 | orchestrator | ok: 
[testbed-node-3] => { 2025-09-23 19:10:08.457786 | orchestrator |  "vgs_report": { 2025-09-23 19:10:08.457798 | orchestrator |  "vg": [] 2025-09-23 19:10:08.457809 | orchestrator |  } 2025-09-23 19:10:08.457819 | orchestrator | } 2025-09-23 19:10:08.457830 | orchestrator | 2025-09-23 19:10:08.457841 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-09-23 19:10:08.457852 | orchestrator | Tuesday 23 September 2025 19:10:05 +0000 (0:00:00.117) 0:00:18.077 ***** 2025-09-23 19:10:08.457862 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.457873 | orchestrator | 2025-09-23 19:10:08.457884 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-09-23 19:10:08.457894 | orchestrator | Tuesday 23 September 2025 19:10:05 +0000 (0:00:00.138) 0:00:18.216 ***** 2025-09-23 19:10:08.457905 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.457915 | orchestrator | 2025-09-23 19:10:08.457926 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-09-23 19:10:08.457937 | orchestrator | Tuesday 23 September 2025 19:10:05 +0000 (0:00:00.123) 0:00:18.339 ***** 2025-09-23 19:10:08.457947 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.457958 | orchestrator | 2025-09-23 19:10:08.457968 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-09-23 19:10:08.457979 | orchestrator | Tuesday 23 September 2025 19:10:05 +0000 (0:00:00.257) 0:00:18.597 ***** 2025-09-23 19:10:08.457990 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458000 | orchestrator | 2025-09-23 19:10:08.458011 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-09-23 19:10:08.458091 | orchestrator | Tuesday 23 September 2025 19:10:05 +0000 (0:00:00.140) 0:00:18.738 ***** 2025-09-23 19:10:08.458104 | orchestrator | skipping: 
[testbed-node-3] 2025-09-23 19:10:08.458114 | orchestrator | 2025-09-23 19:10:08.458172 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-09-23 19:10:08.458184 | orchestrator | Tuesday 23 September 2025 19:10:06 +0000 (0:00:00.135) 0:00:18.873 ***** 2025-09-23 19:10:08.458196 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458206 | orchestrator | 2025-09-23 19:10:08.458217 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-09-23 19:10:08.458228 | orchestrator | Tuesday 23 September 2025 19:10:06 +0000 (0:00:00.140) 0:00:19.013 ***** 2025-09-23 19:10:08.458239 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458250 | orchestrator | 2025-09-23 19:10:08.458260 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-09-23 19:10:08.458271 | orchestrator | Tuesday 23 September 2025 19:10:06 +0000 (0:00:00.143) 0:00:19.157 ***** 2025-09-23 19:10:08.458282 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458293 | orchestrator | 2025-09-23 19:10:08.458303 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-09-23 19:10:08.458334 | orchestrator | Tuesday 23 September 2025 19:10:06 +0000 (0:00:00.136) 0:00:19.293 ***** 2025-09-23 19:10:08.458345 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458356 | orchestrator | 2025-09-23 19:10:08.458367 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-09-23 19:10:08.458378 | orchestrator | Tuesday 23 September 2025 19:10:06 +0000 (0:00:00.141) 0:00:19.435 ***** 2025-09-23 19:10:08.458389 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458399 | orchestrator | 2025-09-23 19:10:08.458410 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-09-23 19:10:08.458421 | 
orchestrator | Tuesday 23 September 2025 19:10:06 +0000 (0:00:00.128) 0:00:19.564 ***** 2025-09-23 19:10:08.458432 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458442 | orchestrator | 2025-09-23 19:10:08.458453 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-09-23 19:10:08.458464 | orchestrator | Tuesday 23 September 2025 19:10:06 +0000 (0:00:00.198) 0:00:19.763 ***** 2025-09-23 19:10:08.458475 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458485 | orchestrator | 2025-09-23 19:10:08.458507 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-09-23 19:10:08.458518 | orchestrator | Tuesday 23 September 2025 19:10:07 +0000 (0:00:00.172) 0:00:19.935 ***** 2025-09-23 19:10:08.458529 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458540 | orchestrator | 2025-09-23 19:10:08.458551 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-09-23 19:10:08.458562 | orchestrator | Tuesday 23 September 2025 19:10:07 +0000 (0:00:00.153) 0:00:20.088 ***** 2025-09-23 19:10:08.458573 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458583 | orchestrator | 2025-09-23 19:10:08.458594 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-09-23 19:10:08.458605 | orchestrator | Tuesday 23 September 2025 19:10:07 +0000 (0:00:00.178) 0:00:20.267 ***** 2025-09-23 19:10:08.458617 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:08.458629 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:08.458639 | orchestrator | skipping: [testbed-node-3] 2025-09-23 
19:10:08.458650 | orchestrator | 2025-09-23 19:10:08.458661 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-09-23 19:10:08.458672 | orchestrator | Tuesday 23 September 2025 19:10:07 +0000 (0:00:00.275) 0:00:20.542 ***** 2025-09-23 19:10:08.458683 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:08.458694 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:08.458705 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458715 | orchestrator | 2025-09-23 19:10:08.458726 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-09-23 19:10:08.458737 | orchestrator | Tuesday 23 September 2025 19:10:07 +0000 (0:00:00.176) 0:00:20.719 ***** 2025-09-23 19:10:08.458753 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:08.458764 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:08.458774 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458785 | orchestrator | 2025-09-23 19:10:08.458796 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-09-23 19:10:08.458807 | orchestrator | Tuesday 23 September 2025 19:10:08 +0000 (0:00:00.147) 0:00:20.867 ***** 2025-09-23 19:10:08.458818 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 
19:10:08.458829 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:08.458839 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458850 | orchestrator | 2025-09-23 19:10:08.458861 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-09-23 19:10:08.458871 | orchestrator | Tuesday 23 September 2025 19:10:08 +0000 (0:00:00.151) 0:00:21.018 ***** 2025-09-23 19:10:08.458882 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:08.458901 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:08.458914 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:08.458931 | orchestrator | 2025-09-23 19:10:08.458942 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-09-23 19:10:08.458953 | orchestrator | Tuesday 23 September 2025 19:10:08 +0000 (0:00:00.152) 0:00:21.171 ***** 2025-09-23 19:10:08.458963 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:08.458981 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:13.454925 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:13.455002 | orchestrator | 2025-09-23 19:10:13.455021 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-09-23 19:10:13.455037 | orchestrator | Tuesday 23 September 2025 
19:10:08 +0000 (0:00:00.137) 0:00:21.308 ***** 2025-09-23 19:10:13.455051 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:13.455066 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:13.455081 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:13.455095 | orchestrator | 2025-09-23 19:10:13.455108 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-09-23 19:10:13.455141 | orchestrator | Tuesday 23 September 2025 19:10:08 +0000 (0:00:00.124) 0:00:21.433 ***** 2025-09-23 19:10:13.455157 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:13.455171 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:13.455184 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:13.455197 | orchestrator | 2025-09-23 19:10:13.455211 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-09-23 19:10:13.455224 | orchestrator | Tuesday 23 September 2025 19:10:08 +0000 (0:00:00.146) 0:00:21.579 ***** 2025-09-23 19:10:13.455237 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:10:13.455251 | orchestrator | 2025-09-23 19:10:13.455264 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-09-23 19:10:13.455277 | orchestrator | Tuesday 23 September 2025 19:10:09 +0000 (0:00:00.529) 0:00:22.109 ***** 2025-09-23 19:10:13.455291 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:10:13.455305 | 
orchestrator | 2025-09-23 19:10:13.455319 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-09-23 19:10:13.455332 | orchestrator | Tuesday 23 September 2025 19:10:09 +0000 (0:00:00.514) 0:00:22.623 ***** 2025-09-23 19:10:13.455345 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:10:13.455358 | orchestrator | 2025-09-23 19:10:13.455371 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-09-23 19:10:13.455384 | orchestrator | Tuesday 23 September 2025 19:10:09 +0000 (0:00:00.150) 0:00:22.774 ***** 2025-09-23 19:10:13.455397 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'vg_name': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'}) 2025-09-23 19:10:13.455412 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'vg_name': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'}) 2025-09-23 19:10:13.455425 | orchestrator | 2025-09-23 19:10:13.455438 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-09-23 19:10:13.455451 | orchestrator | Tuesday 23 September 2025 19:10:10 +0000 (0:00:00.165) 0:00:22.940 ***** 2025-09-23 19:10:13.455464 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:13.455497 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:13.455512 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:13.455525 | orchestrator | 2025-09-23 19:10:13.455538 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-09-23 19:10:13.455552 | orchestrator | Tuesday 23 September 2025 19:10:10 +0000 
(0:00:00.272) 0:00:23.212 ***** 2025-09-23 19:10:13.455565 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:13.455579 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:13.455592 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:13.455605 | orchestrator | 2025-09-23 19:10:13.455619 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2025-09-23 19:10:13.455632 | orchestrator | Tuesday 23 September 2025 19:10:10 +0000 (0:00:00.140) 0:00:23.353 ***** 2025-09-23 19:10:13.455646 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})  2025-09-23 19:10:13.455660 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})  2025-09-23 19:10:13.455673 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:10:13.455687 | orchestrator | 2025-09-23 19:10:13.455700 | orchestrator | TASK [Print LVM report data] *************************************************** 2025-09-23 19:10:13.455714 | orchestrator | Tuesday 23 September 2025 19:10:10 +0000 (0:00:00.170) 0:00:23.524 ***** 2025-09-23 19:10:13.455727 | orchestrator | ok: [testbed-node-3] => { 2025-09-23 19:10:13.455741 | orchestrator |  "lvm_report": { 2025-09-23 19:10:13.455755 | orchestrator |  "lv": [ 2025-09-23 19:10:13.455769 | orchestrator |  { 2025-09-23 19:10:13.455797 | orchestrator |  "lv_name": "osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262", 2025-09-23 19:10:13.455812 | orchestrator |  "vg_name": "ceph-ad3a695b-9edf-562e-89c9-18fadd13d262" 2025-09-23 19:10:13.455825 | 
orchestrator |  }, 2025-09-23 19:10:13.455839 | orchestrator |  { 2025-09-23 19:10:13.455852 | orchestrator |  "lv_name": "osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e", 2025-09-23 19:10:13.455865 | orchestrator |  "vg_name": "ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e" 2025-09-23 19:10:13.455878 | orchestrator |  } 2025-09-23 19:10:13.455891 | orchestrator |  ], 2025-09-23 19:10:13.455904 | orchestrator |  "pv": [ 2025-09-23 19:10:13.455917 | orchestrator |  { 2025-09-23 19:10:13.455931 | orchestrator |  "pv_name": "/dev/sdb", 2025-09-23 19:10:13.455944 | orchestrator |  "vg_name": "ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e" 2025-09-23 19:10:13.455958 | orchestrator |  }, 2025-09-23 19:10:13.455971 | orchestrator |  { 2025-09-23 19:10:13.455984 | orchestrator |  "pv_name": "/dev/sdc", 2025-09-23 19:10:13.455997 | orchestrator |  "vg_name": "ceph-ad3a695b-9edf-562e-89c9-18fadd13d262" 2025-09-23 19:10:13.456010 | orchestrator |  } 2025-09-23 19:10:13.456023 | orchestrator |  ] 2025-09-23 19:10:13.456036 | orchestrator |  } 2025-09-23 19:10:13.456050 | orchestrator | } 2025-09-23 19:10:13.456063 | orchestrator | 2025-09-23 19:10:13.456076 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2025-09-23 19:10:13.456090 | orchestrator | 2025-09-23 19:10:13.456104 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-09-23 19:10:13.456118 | orchestrator | Tuesday 23 September 2025 19:10:10 +0000 (0:00:00.296) 0:00:23.820 ***** 2025-09-23 19:10:13.456152 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-09-23 19:10:13.456177 | orchestrator | 2025-09-23 19:10:13.456187 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-09-23 19:10:13.456196 | orchestrator | Tuesday 23 September 2025 19:10:11 +0000 (0:00:00.219) 0:00:24.040 ***** 2025-09-23 19:10:13.456204 | orchestrator | ok: [testbed-node-4] 2025-09-23 
19:10:13.456213 | orchestrator | 2025-09-23 19:10:13.456222 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:13.456230 | orchestrator | Tuesday 23 September 2025 19:10:11 +0000 (0:00:00.210) 0:00:24.250 ***** 2025-09-23 19:10:13.456251 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2025-09-23 19:10:13.456260 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2025-09-23 19:10:13.456269 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2025-09-23 19:10:13.456277 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2025-09-23 19:10:13.456286 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2025-09-23 19:10:13.456294 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2025-09-23 19:10:13.456303 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2025-09-23 19:10:13.456315 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2025-09-23 19:10:13.456324 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2025-09-23 19:10:13.456332 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2025-09-23 19:10:13.456341 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2025-09-23 19:10:13.456349 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 2025-09-23 19:10:13.456357 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2025-09-23 19:10:13.456366 | orchestrator | 2025-09-23 19:10:13.456374 | orchestrator | TASK 
[Add known links to the list of available block devices] ****************** 2025-09-23 19:10:13.456383 | orchestrator | Tuesday 23 September 2025 19:10:11 +0000 (0:00:00.423) 0:00:24.674 ***** 2025-09-23 19:10:13.456392 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:13.456400 | orchestrator | 2025-09-23 19:10:13.456408 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:13.456417 | orchestrator | Tuesday 23 September 2025 19:10:12 +0000 (0:00:00.193) 0:00:24.868 ***** 2025-09-23 19:10:13.456425 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:13.456434 | orchestrator | 2025-09-23 19:10:13.456442 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:13.456451 | orchestrator | Tuesday 23 September 2025 19:10:12 +0000 (0:00:00.183) 0:00:25.051 ***** 2025-09-23 19:10:13.456459 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:13.456468 | orchestrator | 2025-09-23 19:10:13.456476 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:13.456485 | orchestrator | Tuesday 23 September 2025 19:10:12 +0000 (0:00:00.505) 0:00:25.557 ***** 2025-09-23 19:10:13.456493 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:13.456502 | orchestrator | 2025-09-23 19:10:13.456510 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:13.456519 | orchestrator | Tuesday 23 September 2025 19:10:12 +0000 (0:00:00.188) 0:00:25.746 ***** 2025-09-23 19:10:13.456527 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:13.456535 | orchestrator | 2025-09-23 19:10:13.456544 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:13.456552 | orchestrator | Tuesday 23 September 2025 19:10:13 +0000 (0:00:00.176) 0:00:25.923 ***** 2025-09-23 
19:10:13.456561 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:13.456569 | orchestrator | 2025-09-23 19:10:13.456583 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:13.456592 | orchestrator | Tuesday 23 September 2025 19:10:13 +0000 (0:00:00.184) 0:00:26.108 ***** 2025-09-23 19:10:13.456600 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:13.456609 | orchestrator | 2025-09-23 19:10:13.456624 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:23.181111 | orchestrator | Tuesday 23 September 2025 19:10:13 +0000 (0:00:00.194) 0:00:26.302 ***** 2025-09-23 19:10:23.181235 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.181251 | orchestrator | 2025-09-23 19:10:23.181263 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:23.181274 | orchestrator | Tuesday 23 September 2025 19:10:13 +0000 (0:00:00.180) 0:00:26.482 ***** 2025-09-23 19:10:23.181285 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f) 2025-09-23 19:10:23.181296 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f) 2025-09-23 19:10:23.181307 | orchestrator | 2025-09-23 19:10:23.181318 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:23.181329 | orchestrator | Tuesday 23 September 2025 19:10:13 +0000 (0:00:00.372) 0:00:26.855 ***** 2025-09-23 19:10:23.181340 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85) 2025-09-23 19:10:23.181351 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85) 2025-09-23 19:10:23.181361 | orchestrator | 2025-09-23 19:10:23.181372 | orchestrator | TASK [Add known 
links to the list of available block devices] ****************** 2025-09-23 19:10:23.181383 | orchestrator | Tuesday 23 September 2025 19:10:14 +0000 (0:00:00.412) 0:00:27.268 ***** 2025-09-23 19:10:23.181393 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd) 2025-09-23 19:10:23.181404 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd) 2025-09-23 19:10:23.181415 | orchestrator | 2025-09-23 19:10:23.181426 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:23.181437 | orchestrator | Tuesday 23 September 2025 19:10:14 +0000 (0:00:00.412) 0:00:27.680 ***** 2025-09-23 19:10:23.181447 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8) 2025-09-23 19:10:23.181458 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8) 2025-09-23 19:10:23.181469 | orchestrator | 2025-09-23 19:10:23.181480 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-23 19:10:23.181491 | orchestrator | Tuesday 23 September 2025 19:10:15 +0000 (0:00:00.399) 0:00:28.079 ***** 2025-09-23 19:10:23.181501 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-09-23 19:10:23.181512 | orchestrator | 2025-09-23 19:10:23.181523 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.181534 | orchestrator | Tuesday 23 September 2025 19:10:15 +0000 (0:00:00.332) 0:00:28.411 ***** 2025-09-23 19:10:23.181544 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2025-09-23 19:10:23.181569 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2025-09-23 19:10:23.181581 | orchestrator | 
included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2025-09-23 19:10:23.181592 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2025-09-23 19:10:23.181602 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2025-09-23 19:10:23.181613 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5) 2025-09-23 19:10:23.181623 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2025-09-23 19:10:23.181652 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2025-09-23 19:10:23.181664 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2025-09-23 19:10:23.181676 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2025-09-23 19:10:23.181688 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2025-09-23 19:10:23.181700 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2025-09-23 19:10:23.181711 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2025-09-23 19:10:23.181723 | orchestrator | 2025-09-23 19:10:23.181736 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.181748 | orchestrator | Tuesday 23 September 2025 19:10:16 +0000 (0:00:00.530) 0:00:28.942 ***** 2025-09-23 19:10:23.181760 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.181772 | orchestrator | 2025-09-23 19:10:23.181784 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.181797 | orchestrator | Tuesday 23 September 2025 19:10:16 +0000 
(0:00:00.202) 0:00:29.144 ***** 2025-09-23 19:10:23.181808 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.181820 | orchestrator | 2025-09-23 19:10:23.181833 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.181845 | orchestrator | Tuesday 23 September 2025 19:10:16 +0000 (0:00:00.219) 0:00:29.364 ***** 2025-09-23 19:10:23.181857 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.181869 | orchestrator | 2025-09-23 19:10:23.181881 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.181893 | orchestrator | Tuesday 23 September 2025 19:10:16 +0000 (0:00:00.183) 0:00:29.547 ***** 2025-09-23 19:10:23.181905 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.181917 | orchestrator | 2025-09-23 19:10:23.181946 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.181958 | orchestrator | Tuesday 23 September 2025 19:10:16 +0000 (0:00:00.182) 0:00:29.730 ***** 2025-09-23 19:10:23.181969 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.181979 | orchestrator | 2025-09-23 19:10:23.181990 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.182001 | orchestrator | Tuesday 23 September 2025 19:10:17 +0000 (0:00:00.204) 0:00:29.935 ***** 2025-09-23 19:10:23.182055 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182069 | orchestrator | 2025-09-23 19:10:23.182080 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.182091 | orchestrator | Tuesday 23 September 2025 19:10:17 +0000 (0:00:00.192) 0:00:30.127 ***** 2025-09-23 19:10:23.182102 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182112 | orchestrator | 2025-09-23 19:10:23.182159 | orchestrator | TASK [Add known 
partitions to the list of available block devices] ************* 2025-09-23 19:10:23.182170 | orchestrator | Tuesday 23 September 2025 19:10:17 +0000 (0:00:00.203) 0:00:30.331 ***** 2025-09-23 19:10:23.182181 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182192 | orchestrator | 2025-09-23 19:10:23.182203 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.182213 | orchestrator | Tuesday 23 September 2025 19:10:17 +0000 (0:00:00.187) 0:00:30.518 ***** 2025-09-23 19:10:23.182224 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2025-09-23 19:10:23.182235 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2025-09-23 19:10:23.182246 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2025-09-23 19:10:23.182257 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2025-09-23 19:10:23.182267 | orchestrator | 2025-09-23 19:10:23.182279 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.182290 | orchestrator | Tuesday 23 September 2025 19:10:18 +0000 (0:00:00.802) 0:00:31.321 ***** 2025-09-23 19:10:23.182309 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182320 | orchestrator | 2025-09-23 19:10:23.182331 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.182342 | orchestrator | Tuesday 23 September 2025 19:10:18 +0000 (0:00:00.180) 0:00:31.501 ***** 2025-09-23 19:10:23.182353 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182363 | orchestrator | 2025-09-23 19:10:23.182374 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.182385 | orchestrator | Tuesday 23 September 2025 19:10:18 +0000 (0:00:00.172) 0:00:31.673 ***** 2025-09-23 19:10:23.182395 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182406 | orchestrator | 2025-09-23 
19:10:23.182417 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-23 19:10:23.182428 | orchestrator | Tuesday 23 September 2025 19:10:19 +0000 (0:00:00.474) 0:00:32.148 ***** 2025-09-23 19:10:23.182439 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182449 | orchestrator | 2025-09-23 19:10:23.182460 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-09-23 19:10:23.182471 | orchestrator | Tuesday 23 September 2025 19:10:19 +0000 (0:00:00.183) 0:00:32.331 ***** 2025-09-23 19:10:23.182481 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182492 | orchestrator | 2025-09-23 19:10:23.182503 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-09-23 19:10:23.182514 | orchestrator | Tuesday 23 September 2025 19:10:19 +0000 (0:00:00.134) 0:00:32.466 ***** 2025-09-23 19:10:23.182525 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '1c8984fd-f811-541c-8648-d34ada8a5304'}}) 2025-09-23 19:10:23.182536 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '8028f60e-1a44-5536-9db2-40f94e230aee'}}) 2025-09-23 19:10:23.182547 | orchestrator | 2025-09-23 19:10:23.182557 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-09-23 19:10:23.182568 | orchestrator | Tuesday 23 September 2025 19:10:19 +0000 (0:00:00.184) 0:00:32.650 ***** 2025-09-23 19:10:23.182580 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'}) 2025-09-23 19:10:23.182591 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'}) 2025-09-23 19:10:23.182602 | orchestrator | 2025-09-23 
19:10:23.182613 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-09-23 19:10:23.182624 | orchestrator | Tuesday 23 September 2025 19:10:21 +0000 (0:00:01.875) 0:00:34.526 ***** 2025-09-23 19:10:23.182634 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})  2025-09-23 19:10:23.182646 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})  2025-09-23 19:10:23.182657 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:23.182667 | orchestrator | 2025-09-23 19:10:23.182678 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-09-23 19:10:23.182689 | orchestrator | Tuesday 23 September 2025 19:10:21 +0000 (0:00:00.148) 0:00:34.674 ***** 2025-09-23 19:10:23.182700 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'}) 2025-09-23 19:10:23.182710 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'}) 2025-09-23 19:10:23.182721 | orchestrator | 2025-09-23 19:10:23.182741 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-09-23 19:10:28.330425 | orchestrator | Tuesday 23 September 2025 19:10:23 +0000 (0:00:01.356) 0:00:36.031 ***** 2025-09-23 19:10:28.330538 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})  2025-09-23 19:10:28.330554 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 
'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})  2025-09-23 19:10:28.330566 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.330577 | orchestrator | 2025-09-23 19:10:28.330589 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-09-23 19:10:28.330600 | orchestrator | Tuesday 23 September 2025 19:10:23 +0000 (0:00:00.148) 0:00:36.179 ***** 2025-09-23 19:10:28.330611 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.330621 | orchestrator | 2025-09-23 19:10:28.330632 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-09-23 19:10:28.330643 | orchestrator | Tuesday 23 September 2025 19:10:23 +0000 (0:00:00.124) 0:00:36.304 ***** 2025-09-23 19:10:28.330654 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})  2025-09-23 19:10:28.330679 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})  2025-09-23 19:10:28.330691 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.330702 | orchestrator | 2025-09-23 19:10:28.330712 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-09-23 19:10:28.330723 | orchestrator | Tuesday 23 September 2025 19:10:23 +0000 (0:00:00.138) 0:00:36.442 ***** 2025-09-23 19:10:28.330734 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.330744 | orchestrator | 2025-09-23 19:10:28.330755 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-09-23 19:10:28.330766 | orchestrator | Tuesday 23 September 2025 19:10:23 +0000 (0:00:00.137) 0:00:36.580 ***** 2025-09-23 19:10:28.330776 | orchestrator | skipping: [testbed-node-4] => (item={'data': 
'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})  2025-09-23 19:10:28.330787 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})  2025-09-23 19:10:28.330798 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.330809 | orchestrator | 2025-09-23 19:10:28.330820 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-09-23 19:10:28.330831 | orchestrator | Tuesday 23 September 2025 19:10:23 +0000 (0:00:00.149) 0:00:36.729 ***** 2025-09-23 19:10:28.330846 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.330857 | orchestrator | 2025-09-23 19:10:28.330868 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-09-23 19:10:28.330878 | orchestrator | Tuesday 23 September 2025 19:10:24 +0000 (0:00:00.381) 0:00:37.110 ***** 2025-09-23 19:10:28.330889 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})  2025-09-23 19:10:28.330900 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})  2025-09-23 19:10:28.330911 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.330921 | orchestrator | 2025-09-23 19:10:28.330933 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-09-23 19:10:28.330943 | orchestrator | Tuesday 23 September 2025 19:10:24 +0000 (0:00:00.128) 0:00:37.239 ***** 2025-09-23 19:10:28.330954 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:10:28.330965 | orchestrator | 2025-09-23 19:10:28.330976 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] 
**************** 2025-09-23 19:10:28.330989 | orchestrator | Tuesday 23 September 2025 19:10:24 +0000 (0:00:00.125) 0:00:37.365 ***** 2025-09-23 19:10:28.331008 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})  2025-09-23 19:10:28.331020 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})  2025-09-23 19:10:28.331033 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.331045 | orchestrator | 2025-09-23 19:10:28.331057 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-09-23 19:10:28.331069 | orchestrator | Tuesday 23 September 2025 19:10:24 +0000 (0:00:00.127) 0:00:37.492 ***** 2025-09-23 19:10:28.331081 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})  2025-09-23 19:10:28.331093 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})  2025-09-23 19:10:28.331105 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.331139 | orchestrator | 2025-09-23 19:10:28.331151 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-09-23 19:10:28.331163 | orchestrator | Tuesday 23 September 2025 19:10:24 +0000 (0:00:00.137) 0:00:37.629 ***** 2025-09-23 19:10:28.331193 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})  2025-09-23 19:10:28.331207 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 
'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})  2025-09-23 19:10:28.331219 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.331231 | orchestrator | 2025-09-23 19:10:28.331243 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-09-23 19:10:28.331255 | orchestrator | Tuesday 23 September 2025 19:10:24 +0000 (0:00:00.122) 0:00:37.752 ***** 2025-09-23 19:10:28.331267 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.331278 | orchestrator | 2025-09-23 19:10:28.331290 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-09-23 19:10:28.331302 | orchestrator | Tuesday 23 September 2025 19:10:25 +0000 (0:00:00.119) 0:00:37.871 ***** 2025-09-23 19:10:28.331314 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.331326 | orchestrator | 2025-09-23 19:10:28.331338 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-09-23 19:10:28.331348 | orchestrator | Tuesday 23 September 2025 19:10:25 +0000 (0:00:00.127) 0:00:37.999 ***** 2025-09-23 19:10:28.331359 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:10:28.331370 | orchestrator | 2025-09-23 19:10:28.331380 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-09-23 19:10:28.331391 | orchestrator | Tuesday 23 September 2025 19:10:25 +0000 (0:00:00.130) 0:00:38.129 ***** 2025-09-23 19:10:28.331402 | orchestrator | ok: [testbed-node-4] => { 2025-09-23 19:10:28.331412 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-09-23 19:10:28.331423 | orchestrator | } 2025-09-23 19:10:28.331434 | orchestrator | 2025-09-23 19:10:28.331445 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-09-23 19:10:28.331456 | orchestrator | Tuesday 23 September 2025 19:10:25 +0000 (0:00:00.144) 0:00:38.274 ***** 2025-09-23 19:10:28.331467 | orchestrator | 
ok: [testbed-node-4] => {
2025-09-23 19:10:28.331478 | orchestrator |     "_num_osds_wanted_per_wal_vg": {}
2025-09-23 19:10:28.331488 | orchestrator | }
2025-09-23 19:10:28.331499 | orchestrator | 
2025-09-23 19:10:28.331509 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] *******************************
2025-09-23 19:10:28.331520 | orchestrator | Tuesday 23 September 2025 19:10:25 +0000 (0:00:00.131) 0:00:38.406 *****
2025-09-23 19:10:28.331531 | orchestrator | ok: [testbed-node-4] => {
2025-09-23 19:10:28.331541 | orchestrator |     "_num_osds_wanted_per_db_wal_vg": {}
2025-09-23 19:10:28.331559 | orchestrator | }
2025-09-23 19:10:28.331570 | orchestrator | 
2025-09-23 19:10:28.331581 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ********************
2025-09-23 19:10:28.331592 | orchestrator | Tuesday 23 September 2025 19:10:25 +0000 (0:00:00.129) 0:00:38.535 *****
2025-09-23 19:10:28.331602 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:10:28.331613 | orchestrator | 
2025-09-23 19:10:28.331624 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] *******************
2025-09-23 19:10:28.331635 | orchestrator | Tuesday 23 September 2025 19:10:26 +0000 (0:00:00.626) 0:00:39.162 *****
2025-09-23 19:10:28.331650 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:10:28.331662 | orchestrator | 
2025-09-23 19:10:28.331673 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] ****************
2025-09-23 19:10:28.331684 | orchestrator | Tuesday 23 September 2025 19:10:26 +0000 (0:00:00.504) 0:00:39.666 *****
2025-09-23 19:10:28.331694 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:10:28.331705 | orchestrator | 
2025-09-23 19:10:28.331716 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] *************************
2025-09-23 19:10:28.331727 | orchestrator | Tuesday 23 September 2025 19:10:27 +0000 (0:00:00.520) 0:00:40.187 *****
2025-09-23 19:10:28.331737 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:10:28.331748 | orchestrator | 
2025-09-23 19:10:28.331759 | orchestrator | TASK [Calculate VG sizes (without buffer)] *************************************
2025-09-23 19:10:28.331769 | orchestrator | Tuesday 23 September 2025 19:10:27 +0000 (0:00:00.145) 0:00:40.332 *****
2025-09-23 19:10:28.331780 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:28.331791 | orchestrator | 
2025-09-23 19:10:28.331802 | orchestrator | TASK [Calculate VG sizes (with buffer)] ****************************************
2025-09-23 19:10:28.331812 | orchestrator | Tuesday 23 September 2025 19:10:27 +0000 (0:00:00.101) 0:00:40.434 *****
2025-09-23 19:10:28.331823 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:28.331834 | orchestrator | 
2025-09-23 19:10:28.331844 | orchestrator | TASK [Print LVM VGs report data] ***********************************************
2025-09-23 19:10:28.331855 | orchestrator | Tuesday 23 September 2025 19:10:27 +0000 (0:00:00.100) 0:00:40.535 *****
2025-09-23 19:10:28.331866 | orchestrator | ok: [testbed-node-4] => {
2025-09-23 19:10:28.331877 | orchestrator |     "vgs_report": {
2025-09-23 19:10:28.331888 | orchestrator |         "vg": []
2025-09-23 19:10:28.331899 | orchestrator |     }
2025-09-23 19:10:28.331911 | orchestrator | }
2025-09-23 19:10:28.331921 | orchestrator | 
2025-09-23 19:10:28.331932 | orchestrator | TASK [Print LVM VG sizes] ******************************************************
2025-09-23 19:10:28.331943 | orchestrator | Tuesday 23 September 2025 19:10:27 +0000 (0:00:00.125) 0:00:40.660 *****
2025-09-23 19:10:28.331954 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:28.331964 | orchestrator | 
2025-09-23 19:10:28.331975 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************
2025-09-23 19:10:28.331986 | orchestrator | Tuesday 23 September 2025 19:10:27 +0000 (0:00:00.119) 0:00:40.780 *****
2025-09-23 19:10:28.331996 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:28.332007 | orchestrator | 
2025-09-23 19:10:28.332018 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] ****************************
2025-09-23 19:10:28.332029 | orchestrator | Tuesday 23 September 2025 19:10:28 +0000 (0:00:00.121) 0:00:40.901 *****
2025-09-23 19:10:28.332040 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:28.332050 | orchestrator | 
2025-09-23 19:10:28.332061 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] *******************
2025-09-23 19:10:28.332072 | orchestrator | Tuesday 23 September 2025 19:10:28 +0000 (0:00:00.137) 0:00:41.039 *****
2025-09-23 19:10:28.332083 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:28.332093 | orchestrator | 
2025-09-23 19:10:28.332104 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] ***********************
2025-09-23 19:10:28.332157 | orchestrator | Tuesday 23 September 2025 19:10:28 +0000 (0:00:00.137) 0:00:41.177 *****
2025-09-23 19:10:32.758864 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.758967 | orchestrator | 
2025-09-23 19:10:32.759619 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] ***************************
2025-09-23 19:10:32.759641 | orchestrator | Tuesday 23 September 2025 19:10:28 +0000 (0:00:00.126) 0:00:41.303 *****
2025-09-23 19:10:32.759654 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.759666 | orchestrator | 
2025-09-23 19:10:32.759678 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] *****************
2025-09-23 19:10:32.759690 | orchestrator | Tuesday 23 September 2025 19:10:28 +0000 (0:00:00.265) 0:00:41.569 *****
2025-09-23 19:10:32.759703 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.759715 | orchestrator | 
2025-09-23 19:10:32.759727 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] ****************
2025-09-23 19:10:32.759739 | orchestrator | Tuesday 23 September 2025 19:10:28 +0000 (0:00:00.130) 0:00:41.699 *****
2025-09-23 19:10:32.759752 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.759764 | orchestrator | 
2025-09-23 19:10:32.759774 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ********************
2025-09-23 19:10:32.759785 | orchestrator | Tuesday 23 September 2025 19:10:28 +0000 (0:00:00.140) 0:00:41.840 *****
2025-09-23 19:10:32.759795 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.759806 | orchestrator | 
2025-09-23 19:10:32.759817 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] *****************
2025-09-23 19:10:32.759827 | orchestrator | Tuesday 23 September 2025 19:10:29 +0000 (0:00:00.125) 0:00:41.966 *****
2025-09-23 19:10:32.759838 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.759848 | orchestrator | 
2025-09-23 19:10:32.759859 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] *********************
2025-09-23 19:10:32.759869 | orchestrator | Tuesday 23 September 2025 19:10:29 +0000 (0:00:00.136) 0:00:42.102 *****
2025-09-23 19:10:32.759880 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.759890 | orchestrator | 
2025-09-23 19:10:32.759901 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] ***********
2025-09-23 19:10:32.759911 | orchestrator | Tuesday 23 September 2025 19:10:29 +0000 (0:00:00.130) 0:00:42.233 *****
2025-09-23 19:10:32.759922 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.759932 | orchestrator | 
2025-09-23 19:10:32.759943 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] *************************
2025-09-23 19:10:32.759953 | orchestrator | Tuesday 23 September 2025 19:10:29 +0000 (0:00:00.137) 0:00:42.371 *****
2025-09-23 19:10:32.759964 | orchestrator | skipping: [testbed-node-4]
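The "Gather DB/WAL VGs with total and available size in bytes" tasks above collect VG capacity data, and the "Fail if size of ... LVs > available" checks compare the requested LV sizes against each VG's free space. A minimal sketch of that capacity check, assuming a report shaped like `vgs --units b --reportformat json` output; the function name, field values, and VG name here are illustrative, not taken from this run:

```python
# Sketch of the "Fail if size of DB/WAL LVs > available" capacity check.
# The report layout mirrors `vgs --units b --reportformat json`; the
# numbers and the VG name below are made up for illustration.

def fits_in_vg(vgs_report, vg_name, wanted_bytes):
    """Return True if `wanted_bytes` of new LVs fit into the VG's free space."""
    for vg in vgs_report["report"][0]["vg"]:
        if vg["vg_name"] == vg_name:
            # With --units b, vgs emits sizes like "32212254720B"
            free = int(vg["vg_free"].rstrip("B"))
            return wanted_bytes <= free
    raise KeyError(f"VG {vg_name} not found in report")

report = {"report": [{"vg": [
    {"vg_name": "ceph-db", "vg_size": "32212254720B", "vg_free": "32212254720B"},
]}]}

print(fits_in_vg(report, "ceph-db", 30 * 1024**3))  # 30 GiB fits exactly: True
```

The 30 GiB figure echoes the minimum DB LV size enforced by the "Fail if DB LV size < 30 GiB" tasks in the log.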
2025-09-23 19:10:32.759974 | orchestrator | 
2025-09-23 19:10:32.759985 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] *********************
2025-09-23 19:10:32.759995 | orchestrator | Tuesday 23 September 2025 19:10:29 +0000 (0:00:00.125) 0:00:42.496 *****
2025-09-23 19:10:32.760006 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760016 | orchestrator | 
2025-09-23 19:10:32.760026 | orchestrator | TASK [Create DB LVs for ceph_db_devices] ***************************************
2025-09-23 19:10:32.760037 | orchestrator | Tuesday 23 September 2025 19:10:29 +0000 (0:00:00.133) 0:00:42.630 *****
2025-09-23 19:10:32.760061 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760073 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760083 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760094 | orchestrator | 
2025-09-23 19:10:32.760105 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] *******************************
2025-09-23 19:10:32.760132 | orchestrator | Tuesday 23 September 2025 19:10:29 +0000 (0:00:00.153) 0:00:42.783 *****
2025-09-23 19:10:32.760143 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760154 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760173 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760184 | orchestrator | 
2025-09-23 19:10:32.760195 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] *************************************
2025-09-23 19:10:32.760205 | orchestrator | Tuesday 23 September 2025 19:10:30 +0000 (0:00:00.144) 0:00:42.928 *****
2025-09-23 19:10:32.760216 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760226 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760237 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760248 | orchestrator | 
2025-09-23 19:10:32.760258 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] *****************************
2025-09-23 19:10:32.760269 | orchestrator | Tuesday 23 September 2025 19:10:30 +0000 (0:00:00.137) 0:00:43.066 *****
2025-09-23 19:10:32.760279 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760290 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760300 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760311 | orchestrator | 
2025-09-23 19:10:32.760322 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] **********************************
2025-09-23 19:10:32.760350 | orchestrator | Tuesday 23 September 2025 19:10:30 +0000 (0:00:00.281) 0:00:43.347 *****
2025-09-23 19:10:32.760361 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760372 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760383 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760394 | orchestrator | 
2025-09-23 19:10:32.760404 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] **************************
2025-09-23 19:10:32.760415 | orchestrator | Tuesday 23 September 2025 19:10:30 +0000 (0:00:00.142) 0:00:43.490 *****
2025-09-23 19:10:32.760425 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760436 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760446 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760457 | orchestrator | 
2025-09-23 19:10:32.760468 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] ***********************************
2025-09-23 19:10:32.760478 | orchestrator | Tuesday 23 September 2025 19:10:30 +0000 (0:00:00.156) 0:00:43.646 *****
2025-09-23 19:10:32.760489 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760500 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760510 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760521 | orchestrator | 
2025-09-23 19:10:32.760531 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] ***************************
2025-09-23 19:10:32.760542 | orchestrator | Tuesday 23 September 2025 19:10:30 +0000 (0:00:00.154) 0:00:43.800 *****
2025-09-23 19:10:32.760552 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760569 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760580 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760590 | orchestrator | 
2025-09-23 19:10:32.760601 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ********************************
2025-09-23 19:10:32.760646 | orchestrator | Tuesday 23 September 2025 19:10:31 +0000 (0:00:00.152) 0:00:43.953 *****
2025-09-23 19:10:32.760658 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:10:32.760669 | orchestrator | 
2025-09-23 19:10:32.760680 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ********************************
2025-09-23 19:10:32.760691 | orchestrator | Tuesday 23 September 2025 19:10:31 +0000 (0:00:00.509) 0:00:44.463 *****
2025-09-23 19:10:32.760701 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:10:32.760712 | orchestrator | 
2025-09-23 19:10:32.760723 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] ***********************
2025-09-23 19:10:32.760733 | orchestrator | Tuesday 23 September 2025 19:10:32 +0000 (0:00:00.557) 0:00:45.021 *****
2025-09-23 19:10:32.760744 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:10:32.760754 | orchestrator | 
2025-09-23 19:10:32.760765 | orchestrator | TASK [Create list of VG/LV names] **********************************************
2025-09-23 19:10:32.760776 | orchestrator | Tuesday 23 September 2025 19:10:32 +0000 (0:00:00.116) 0:00:45.137 *****
2025-09-23 19:10:32.760786 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'vg_name': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760798 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'vg_name': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760808 | orchestrator | 
2025-09-23 19:10:32.760819 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] **********************
2025-09-23 19:10:32.760830 | orchestrator | Tuesday 23 September 2025 19:10:32 +0000 (0:00:00.153) 0:00:45.291 *****
2025-09-23 19:10:32.760840 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760851 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760862 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:32.760873 | orchestrator | 
2025-09-23 19:10:32.760883 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] *************************
2025-09-23 19:10:32.760894 | orchestrator | Tuesday 23 September 2025 19:10:32 +0000 (0:00:00.155) 0:00:45.446 *****
2025-09-23 19:10:32.760904 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:32.760915 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:32.760933 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:38.373705 | orchestrator | 
2025-09-23 19:10:38.373812 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************
2025-09-23 19:10:38.373828 | orchestrator | Tuesday 23 September 2025 19:10:32 +0000 (0:00:00.161) 0:00:45.608 *****
2025-09-23 19:10:38.374324 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:10:38.374344 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:10:38.374355 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:10:38.374367 | orchestrator | 
2025-09-23 19:10:38.374378 | orchestrator | TASK [Print LVM report data] ***************************************************
2025-09-23 19:10:38.374389 | orchestrator | Tuesday 23 September 2025 19:10:32 +0000 (0:00:00.157) 0:00:45.766 *****
2025-09-23 19:10:38.374419 | orchestrator | ok: [testbed-node-4] => {
2025-09-23 19:10:38.374431 | orchestrator |     "lvm_report": {
2025-09-23 19:10:38.374444 | orchestrator |         "lv": [
2025-09-23 19:10:38.374455 | orchestrator |             {
2025-09-23 19:10:38.374466 | orchestrator |                 "lv_name": "osd-block-1c8984fd-f811-541c-8648-d34ada8a5304",
2025-09-23 19:10:38.374478 | orchestrator |                 "vg_name": "ceph-1c8984fd-f811-541c-8648-d34ada8a5304"
2025-09-23 19:10:38.374488 | orchestrator |             },
2025-09-23 19:10:38.374499 | orchestrator |             {
2025-09-23 19:10:38.374509 | orchestrator |                 "lv_name": "osd-block-8028f60e-1a44-5536-9db2-40f94e230aee",
2025-09-23 19:10:38.374520 | orchestrator |                 "vg_name": "ceph-8028f60e-1a44-5536-9db2-40f94e230aee"
2025-09-23 19:10:38.374530 | orchestrator |             }
2025-09-23 19:10:38.374541 | orchestrator |         ],
2025-09-23 19:10:38.374552 | orchestrator |         "pv": [
2025-09-23 19:10:38.374562 | orchestrator |             {
2025-09-23 19:10:38.374573 | orchestrator |                 "pv_name": "/dev/sdb",
2025-09-23 19:10:38.374583 | orchestrator |                 "vg_name": "ceph-1c8984fd-f811-541c-8648-d34ada8a5304"
2025-09-23 19:10:38.374593 | orchestrator |             },
2025-09-23 19:10:38.374604 | orchestrator |             {
2025-09-23 19:10:38.374614 | orchestrator |                 "pv_name": "/dev/sdc",
2025-09-23 19:10:38.374624 | orchestrator |                 "vg_name": "ceph-8028f60e-1a44-5536-9db2-40f94e230aee"
2025-09-23 19:10:38.374634 | orchestrator |             }
2025-09-23 19:10:38.374643 | orchestrator |         ]
2025-09-23 19:10:38.374652 | orchestrator |     }
2025-09-23 19:10:38.374662 | orchestrator | }
2025-09-23 19:10:38.374672 | orchestrator | 
2025-09-23 19:10:38.374681 | orchestrator | PLAY [Ceph create LVM devices] *************************************************
2025-09-23 19:10:38.374691 | orchestrator | 
2025-09-23 19:10:38.374700 | orchestrator | TASK [Get extra vars for Ceph configuration] ***********************************
2025-09-23 19:10:38.374710 | orchestrator | Tuesday 23 September 2025 19:10:33 +0000 (0:00:00.417) 0:00:46.183 *****
2025-09-23 19:10:38.374719 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)]
2025-09-23 19:10:38.374728 | orchestrator | 
2025-09-23 19:10:38.374748 | orchestrator | TASK [Get initial list of available block devices] *****************************
2025-09-23 19:10:38.374758 | orchestrator | Tuesday 23 September 2025 19:10:33 +0000 (0:00:00.245) 0:00:46.428 *****
2025-09-23 19:10:38.374768 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:10:38.374778 | orchestrator | 
2025-09-23 19:10:38.374788 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.374797 | orchestrator | Tuesday 23 September 2025 19:10:33 +0000 (0:00:00.208) 0:00:46.637 *****
2025-09-23 19:10:38.374807 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0)
2025-09-23 19:10:38.374816 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1)
2025-09-23 19:10:38.374826 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2)
2025-09-23 19:10:38.374835 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3)
2025-09-23 19:10:38.374845 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4)
2025-09-23 19:10:38.374854 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop5)
2025-09-23 19:10:38.374863 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6)
2025-09-23 19:10:38.374873 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7)
2025-09-23 19:10:38.374882 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda)
2025-09-23 19:10:38.374892 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb)
2025-09-23 19:10:38.374901 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc)
2025-09-23 19:10:38.374917 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd)
2025-09-23 19:10:38.374927 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0)
2025-09-23 19:10:38.374936 | orchestrator | 
2025-09-23 19:10:38.374946 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.374955 | orchestrator | Tuesday 23 September 2025 19:10:34 +0000 (0:00:00.376) 0:00:47.013 *****
2025-09-23 19:10:38.374965 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:38.374977 | orchestrator | 
2025-09-23 19:10:38.374987 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.374996 | orchestrator | Tuesday 23 September 2025 19:10:34 +0000 (0:00:00.203) 0:00:47.216 *****
2025-09-23 19:10:38.375006 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:38.375015 | orchestrator | 
2025-09-23 19:10:38.375025 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375050 | orchestrator | Tuesday 23 September 2025 19:10:34 +0000 (0:00:00.199) 0:00:47.415 *****
2025-09-23 19:10:38.375060 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:38.375070 | orchestrator | 
2025-09-23 19:10:38.375080 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375089 | orchestrator | Tuesday 23 September 2025 19:10:34 +0000 (0:00:00.192) 0:00:47.608 *****
2025-09-23 19:10:38.375098 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:38.375129 | orchestrator | 
2025-09-23 19:10:38.375140 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375150 | orchestrator | Tuesday 23 September 2025 19:10:34 +0000 (0:00:00.182) 0:00:47.791 *****
2025-09-23 19:10:38.375159 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:38.375169 | orchestrator | 
2025-09-23 19:10:38.375178 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375188 | orchestrator | Tuesday 23 September 2025 19:10:35 +0000 (0:00:00.184) 0:00:47.976 *****
2025-09-23 19:10:38.375197 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:38.375207 | orchestrator | 
2025-09-23 19:10:38.375216 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375226 | orchestrator | Tuesday 23 September 2025 19:10:35 +0000 (0:00:00.450) 0:00:48.426 *****
2025-09-23 19:10:38.375235 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:38.375244 | orchestrator | 
2025-09-23 19:10:38.375254 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375263 | orchestrator | Tuesday 23 September 2025 19:10:35 +0000 (0:00:00.198) 0:00:48.625 *****
2025-09-23 19:10:38.375273 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:38.375282 | orchestrator | 
2025-09-23 19:10:38.375291 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375301 | orchestrator | Tuesday 23 September 2025 19:10:35 +0000 (0:00:00.214) 0:00:48.839 *****
2025-09-23 19:10:38.375310 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705)
2025-09-23 19:10:38.375321 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705)
2025-09-23 19:10:38.375330 | orchestrator | 
2025-09-23 19:10:38.375340 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375349 | orchestrator | Tuesday 23 September 2025 19:10:36 +0000 (0:00:00.402) 0:00:49.242 *****
2025-09-23 19:10:38.375358 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b)
2025-09-23 19:10:38.375368 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b)
2025-09-23 19:10:38.375377 | orchestrator | 
2025-09-23 19:10:38.375386 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375396 | orchestrator | Tuesday 23 September 2025 19:10:36 +0000 (0:00:00.387) 0:00:49.629 *****
2025-09-23 19:10:38.375417 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052)
2025-09-23 19:10:38.375427 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052)
2025-09-23 19:10:38.375436 | orchestrator | 
2025-09-23 19:10:38.375446 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375455 | orchestrator | Tuesday 23 September 2025 19:10:37 +0000 (0:00:00.458) 0:00:50.087 *****
2025-09-23 19:10:38.375465 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e)
2025-09-23 19:10:38.375474 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e)
2025-09-23 19:10:38.375484 | orchestrator | 
2025-09-23 19:10:38.375493 | orchestrator | TASK [Add known links to the list of available block devices] ******************
2025-09-23 19:10:38.375503 | orchestrator | Tuesday 23 September 2025 19:10:37 +0000 (0:00:00.437) 0:00:50.525 *****
2025-09-23 19:10:38.375512 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001)
2025-09-23 19:10:38.375522 | orchestrator | 
2025-09-23 19:10:38.375531 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:38.375541 | orchestrator | Tuesday 23 September 2025 19:10:37 +0000 (0:00:00.322) 0:00:50.847 *****
2025-09-23 19:10:38.375550 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0)
2025-09-23 19:10:38.375559 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1)
2025-09-23 19:10:38.375569 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2)
2025-09-23 19:10:38.375578 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3)
2025-09-23 19:10:38.375588 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4)
2025-09-23 19:10:38.375597 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5)
2025-09-23 19:10:38.375606 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6)
2025-09-23 19:10:38.375616 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7)
2025-09-23 19:10:38.375625 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda)
2025-09-23 19:10:38.375635 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb)
2025-09-23 19:10:38.375644 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc)
2025-09-23 19:10:38.375659 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd)
2025-09-23 19:10:47.011667 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0)
2025-09-23 19:10:47.011764 | orchestrator | 
2025-09-23 19:10:47.011780 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.011793 | orchestrator | Tuesday 23 September 2025 19:10:38 +0000 (0:00:00.370) 0:00:51.218 *****
2025-09-23 19:10:47.011804 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.011815 | orchestrator | 
2025-09-23 19:10:47.011826 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.011837 | orchestrator | Tuesday 23 September 2025 19:10:38 +0000 (0:00:00.193) 0:00:51.411 *****
2025-09-23 19:10:47.011848 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.011859 | orchestrator | 
2025-09-23 19:10:47.011869 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.011880 | orchestrator | Tuesday 23 September 2025 19:10:38 +0000 (0:00:00.206) 0:00:51.618 *****
2025-09-23 19:10:47.011891 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.011901 | orchestrator | 
2025-09-23 19:10:47.011912 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.011942 | orchestrator | Tuesday 23 September 2025 19:10:39 +0000 (0:00:00.463) 0:00:52.082 *****
2025-09-23 19:10:47.011953 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.011963 | orchestrator | 
2025-09-23 19:10:47.011974 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.011985 | orchestrator | Tuesday 23 September 2025 19:10:39 +0000 (0:00:00.194) 0:00:52.276 *****
2025-09-23 19:10:47.011995 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012006 | orchestrator | 
2025-09-23 19:10:47.012016 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.012027 | orchestrator | Tuesday 23 September 2025 19:10:39 +0000 (0:00:00.179) 0:00:52.455 *****
2025-09-23 19:10:47.012037 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012048 | orchestrator | 
2025-09-23 19:10:47.012059 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.012070 | orchestrator | Tuesday 23 September 2025 19:10:39 +0000 (0:00:00.185) 0:00:52.641 *****
2025-09-23 19:10:47.012082 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012094 | orchestrator | 
2025-09-23 19:10:47.012169 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.012185 | orchestrator | Tuesday 23 September 2025 19:10:39 +0000 (0:00:00.193) 0:00:52.835 *****
2025-09-23 19:10:47.012198 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012208 | orchestrator | 
2025-09-23 19:10:47.012226 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.012244 | orchestrator | Tuesday 23 September 2025 19:10:40 +0000 (0:00:00.165) 0:00:53.000 *****
2025-09-23 19:10:47.012262 | orchestrator | ok: [testbed-node-5] => (item=sda1)
2025-09-23 19:10:47.012279 | orchestrator | ok: [testbed-node-5] => (item=sda14)
2025-09-23 19:10:47.012299 | orchestrator | ok: [testbed-node-5] => (item=sda15)
2025-09-23 19:10:47.012313 | orchestrator | ok: [testbed-node-5] => (item=sda16)
2025-09-23 19:10:47.012324 | orchestrator | 
2025-09-23 19:10:47.012334 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.012345 | orchestrator | Tuesday 23 September 2025 19:10:40 +0000 (0:00:00.590) 0:00:53.590 *****
2025-09-23 19:10:47.012355 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012366 | orchestrator | 
2025-09-23 19:10:47.012377 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.012387 | orchestrator | Tuesday 23 September 2025 19:10:40 +0000 (0:00:00.183) 0:00:53.774 *****
2025-09-23 19:10:47.012398 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012408 | orchestrator | 
2025-09-23 19:10:47.012419 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.012430 | orchestrator | Tuesday 23 September 2025 19:10:41 +0000 (0:00:00.184) 0:00:53.959 *****
2025-09-23 19:10:47.012440 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012451 | orchestrator | 
2025-09-23 19:10:47.012461 | orchestrator | TASK [Add known partitions to the list of available block devices] *************
2025-09-23 19:10:47.012472 | orchestrator | Tuesday 23 September 2025 19:10:41 +0000 (0:00:00.199) 0:00:54.158 *****
2025-09-23 19:10:47.012482 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012493 | orchestrator | 
2025-09-23 19:10:47.012503 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] *******************
2025-09-23 19:10:47.012514 | orchestrator | Tuesday 23 September 2025 19:10:41 +0000 (0:00:00.192) 0:00:54.351 *****
2025-09-23 19:10:47.012524 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012534 | orchestrator | 
2025-09-23 19:10:47.012545 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] *******************
2025-09-23 19:10:47.012556 | orchestrator | Tuesday 23 September 2025 19:10:41 +0000 (0:00:00.247) 0:00:54.599 *****
2025-09-23 19:10:47.012566 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'}})
2025-09-23 19:10:47.012577 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 'a2ccb3fa-3e8c-5172-95cb-7cae39233d42'}})
2025-09-23 19:10:47.012598 | orchestrator | 
2025-09-23 19:10:47.012609 | orchestrator | TASK [Create block VGs] ********************************************************
2025-09-23 19:10:47.012620 | orchestrator | Tuesday 23 September 2025 19:10:41 +0000 (0:00:00.186) 0:00:54.785 *****
2025-09-23 19:10:47.012632 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})
2025-09-23 19:10:47.012643 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})
2025-09-23 19:10:47.012654 | orchestrator | 
2025-09-23 19:10:47.012665 | orchestrator | TASK [Print 'Create block VGs'] ************************************************
2025-09-23 19:10:47.012695 | orchestrator | Tuesday 23 September 2025 19:10:43 +0000 (0:00:01.897) 0:00:56.682 *****
2025-09-23 19:10:47.012706 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})
2025-09-23 19:10:47.012718 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})
2025-09-23 19:10:47.012729 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012739 | orchestrator | 
2025-09-23 19:10:47.012750 | orchestrator | TASK [Create block LVs] ********************************************************
2025-09-23 19:10:47.012760 | orchestrator | Tuesday 23 September 2025 19:10:43 +0000 (0:00:00.136) 0:00:56.819 *****
2025-09-23 19:10:47.012771 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})
2025-09-23 19:10:47.012796 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})
2025-09-23 19:10:47.012808 | orchestrator | 
2025-09-23 19:10:47.012819 | orchestrator | TASK [Print 'Create block LVs'] ************************************************
2025-09-23 19:10:47.012830 | orchestrator | Tuesday 23 September 2025 19:10:45 +0000 (0:00:01.491) 0:00:58.311 *****
2025-09-23 19:10:47.012840 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})
2025-09-23 19:10:47.012851 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})
2025-09-23 19:10:47.012862 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012873 | orchestrator | 
2025-09-23 19:10:47.012883 | orchestrator | TASK [Create DB VGs] ***********************************************************
2025-09-23 19:10:47.012894 | orchestrator | Tuesday 23 September 2025 19:10:45 +0000 (0:00:00.150) 0:00:58.478 *****
2025-09-23 19:10:47.012904 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012915 | orchestrator | 
2025-09-23 19:10:47.012925 | orchestrator | TASK [Print 'Create DB VGs'] ***************************************************
2025-09-23 19:10:47.012936 | orchestrator | Tuesday 23 September 2025 19:10:45 +0000 (0:00:00.150) 0:00:58.628 *****
2025-09-23 19:10:47.012946 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})
2025-09-23 19:10:47.012962 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})
2025-09-23 19:10:47.012973 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.012983 | orchestrator | 
2025-09-23 19:10:47.012994 | orchestrator | TASK [Create WAL VGs] **********************************************************
2025-09-23 19:10:47.013005 | orchestrator | Tuesday 23 September 2025 19:10:45 +0000 (0:00:00.154) 0:00:58.783 *****
2025-09-23 19:10:47.013015 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.013034 | orchestrator | 
2025-09-23 19:10:47.013045 | orchestrator | TASK [Print 'Create WAL VGs'] **************************************************
2025-09-23 19:10:47.013055 | orchestrator | Tuesday 23 September 2025 19:10:46 +0000 (0:00:00.154) 0:00:58.937 *****
2025-09-23 19:10:47.013066 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})
2025-09-23 19:10:47.013077 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})
2025-09-23 19:10:47.013088 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.013098 | orchestrator | 
2025-09-23 19:10:47.013134 | orchestrator | TASK [Create DB+WAL VGs] *******************************************************
2025-09-23 19:10:47.013147 | orchestrator | Tuesday 23 September 2025 19:10:46 +0000 (0:00:00.164) 0:00:59.102 *****
2025-09-23 19:10:47.013157 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:10:47.013168 | orchestrator | 
2025-09-23 19:10:47.013179 | orchestrator | TASK
[Print 'Create DB+WAL VGs'] *********************************************** 2025-09-23 19:10:47.013189 | orchestrator | Tuesday 23 September 2025 19:10:46 +0000 (0:00:00.151) 0:00:59.254 ***** 2025-09-23 19:10:47.013200 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:47.013212 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:47.013231 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:47.013249 | orchestrator | 2025-09-23 19:10:47.013267 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-09-23 19:10:47.013287 | orchestrator | Tuesday 23 September 2025 19:10:46 +0000 (0:00:00.129) 0:00:59.383 ***** 2025-09-23 19:10:47.013305 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:10:47.013321 | orchestrator | 2025-09-23 19:10:47.013332 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-09-23 19:10:47.013342 | orchestrator | Tuesday 23 September 2025 19:10:46 +0000 (0:00:00.307) 0:00:59.690 ***** 2025-09-23 19:10:47.013361 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:52.588787 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:52.588877 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.588885 | orchestrator | 2025-09-23 19:10:52.588892 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-09-23 19:10:52.588901 | orchestrator | Tuesday 23 September 2025 
19:10:47 +0000 (0:00:00.171) 0:00:59.862 ***** 2025-09-23 19:10:52.588908 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:52.588915 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:52.588922 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.588929 | orchestrator | 2025-09-23 19:10:52.588934 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-09-23 19:10:52.588941 | orchestrator | Tuesday 23 September 2025 19:10:47 +0000 (0:00:00.146) 0:01:00.009 ***** 2025-09-23 19:10:52.588947 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:52.588954 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:52.588960 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.588986 | orchestrator | 2025-09-23 19:10:52.588993 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-09-23 19:10:52.588999 | orchestrator | Tuesday 23 September 2025 19:10:47 +0000 (0:00:00.174) 0:01:00.183 ***** 2025-09-23 19:10:52.589005 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589011 | orchestrator | 2025-09-23 19:10:52.589017 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-09-23 19:10:52.589023 | orchestrator | Tuesday 23 September 2025 19:10:47 +0000 (0:00:00.132) 0:01:00.316 ***** 2025-09-23 19:10:52.589030 | orchestrator | skipping: [testbed-node-5] 2025-09-23 
19:10:52.589036 | orchestrator | 2025-09-23 19:10:52.589041 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-09-23 19:10:52.589047 | orchestrator | Tuesday 23 September 2025 19:10:47 +0000 (0:00:00.142) 0:01:00.458 ***** 2025-09-23 19:10:52.589053 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589059 | orchestrator | 2025-09-23 19:10:52.589065 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-09-23 19:10:52.589084 | orchestrator | Tuesday 23 September 2025 19:10:47 +0000 (0:00:00.123) 0:01:00.582 ***** 2025-09-23 19:10:52.589090 | orchestrator | ok: [testbed-node-5] => { 2025-09-23 19:10:52.589096 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-09-23 19:10:52.589117 | orchestrator | } 2025-09-23 19:10:52.589124 | orchestrator | 2025-09-23 19:10:52.589130 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-09-23 19:10:52.589135 | orchestrator | Tuesday 23 September 2025 19:10:47 +0000 (0:00:00.148) 0:01:00.731 ***** 2025-09-23 19:10:52.589141 | orchestrator | ok: [testbed-node-5] => { 2025-09-23 19:10:52.589146 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-09-23 19:10:52.589151 | orchestrator | } 2025-09-23 19:10:52.589157 | orchestrator | 2025-09-23 19:10:52.589163 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-09-23 19:10:52.589169 | orchestrator | Tuesday 23 September 2025 19:10:48 +0000 (0:00:00.127) 0:01:00.859 ***** 2025-09-23 19:10:52.589174 | orchestrator | ok: [testbed-node-5] => { 2025-09-23 19:10:52.589180 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-09-23 19:10:52.589185 | orchestrator | } 2025-09-23 19:10:52.589191 | orchestrator | 2025-09-23 19:10:52.589196 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-09-23 19:10:52.589202 | 
orchestrator | Tuesday 23 September 2025 19:10:48 +0000 (0:00:00.130) 0:01:00.990 ***** 2025-09-23 19:10:52.589208 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:10:52.589214 | orchestrator | 2025-09-23 19:10:52.589219 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2025-09-23 19:10:52.589225 | orchestrator | Tuesday 23 September 2025 19:10:48 +0000 (0:00:00.511) 0:01:01.501 ***** 2025-09-23 19:10:52.589231 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:10:52.589236 | orchestrator | 2025-09-23 19:10:52.589241 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-09-23 19:10:52.589247 | orchestrator | Tuesday 23 September 2025 19:10:49 +0000 (0:00:00.497) 0:01:01.998 ***** 2025-09-23 19:10:52.589253 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:10:52.589258 | orchestrator | 2025-09-23 19:10:52.589264 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-09-23 19:10:52.589270 | orchestrator | Tuesday 23 September 2025 19:10:49 +0000 (0:00:00.639) 0:01:02.638 ***** 2025-09-23 19:10:52.589275 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:10:52.589281 | orchestrator | 2025-09-23 19:10:52.589287 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-09-23 19:10:52.589292 | orchestrator | Tuesday 23 September 2025 19:10:49 +0000 (0:00:00.129) 0:01:02.767 ***** 2025-09-23 19:10:52.589298 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589303 | orchestrator | 2025-09-23 19:10:52.589309 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-09-23 19:10:52.589315 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.097) 0:01:02.865 ***** 2025-09-23 19:10:52.589328 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589334 | orchestrator | 2025-09-23 19:10:52.589340 | 
orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-09-23 19:10:52.589347 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.110) 0:01:02.975 ***** 2025-09-23 19:10:52.589353 | orchestrator | ok: [testbed-node-5] => { 2025-09-23 19:10:52.589359 | orchestrator |  "vgs_report": { 2025-09-23 19:10:52.589366 | orchestrator |  "vg": [] 2025-09-23 19:10:52.589388 | orchestrator |  } 2025-09-23 19:10:52.589394 | orchestrator | } 2025-09-23 19:10:52.589400 | orchestrator | 2025-09-23 19:10:52.589405 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-09-23 19:10:52.589411 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.128) 0:01:03.104 ***** 2025-09-23 19:10:52.589417 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589423 | orchestrator | 2025-09-23 19:10:52.589429 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-09-23 19:10:52.589436 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.130) 0:01:03.235 ***** 2025-09-23 19:10:52.589441 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589447 | orchestrator | 2025-09-23 19:10:52.589453 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-09-23 19:10:52.589459 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.105) 0:01:03.341 ***** 2025-09-23 19:10:52.589465 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589471 | orchestrator | 2025-09-23 19:10:52.589477 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-09-23 19:10:52.589483 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.105) 0:01:03.446 ***** 2025-09-23 19:10:52.589489 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589494 | orchestrator | 2025-09-23 19:10:52.589500 | 
orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-09-23 19:10:52.589506 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.120) 0:01:03.566 ***** 2025-09-23 19:10:52.589512 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589518 | orchestrator | 2025-09-23 19:10:52.589524 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-09-23 19:10:52.589530 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.128) 0:01:03.694 ***** 2025-09-23 19:10:52.589535 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589541 | orchestrator | 2025-09-23 19:10:52.589547 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-09-23 19:10:52.589553 | orchestrator | Tuesday 23 September 2025 19:10:50 +0000 (0:00:00.131) 0:01:03.826 ***** 2025-09-23 19:10:52.589559 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589565 | orchestrator | 2025-09-23 19:10:52.589571 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-09-23 19:10:52.589577 | orchestrator | Tuesday 23 September 2025 19:10:51 +0000 (0:00:00.131) 0:01:03.957 ***** 2025-09-23 19:10:52.589583 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589589 | orchestrator | 2025-09-23 19:10:52.589594 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-09-23 19:10:52.589600 | orchestrator | Tuesday 23 September 2025 19:10:51 +0000 (0:00:00.126) 0:01:04.084 ***** 2025-09-23 19:10:52.589606 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589612 | orchestrator | 2025-09-23 19:10:52.589618 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-09-23 19:10:52.589628 | orchestrator | Tuesday 23 September 2025 19:10:51 +0000 (0:00:00.256) 0:01:04.340 ***** 
2025-09-23 19:10:52.589634 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589640 | orchestrator | 2025-09-23 19:10:52.589646 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-09-23 19:10:52.589652 | orchestrator | Tuesday 23 September 2025 19:10:51 +0000 (0:00:00.129) 0:01:04.469 ***** 2025-09-23 19:10:52.589657 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589668 | orchestrator | 2025-09-23 19:10:52.589674 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-09-23 19:10:52.589679 | orchestrator | Tuesday 23 September 2025 19:10:51 +0000 (0:00:00.121) 0:01:04.591 ***** 2025-09-23 19:10:52.589685 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589690 | orchestrator | 2025-09-23 19:10:52.589696 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-09-23 19:10:52.589702 | orchestrator | Tuesday 23 September 2025 19:10:51 +0000 (0:00:00.106) 0:01:04.698 ***** 2025-09-23 19:10:52.589709 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589715 | orchestrator | 2025-09-23 19:10:52.589721 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-09-23 19:10:52.589727 | orchestrator | Tuesday 23 September 2025 19:10:51 +0000 (0:00:00.153) 0:01:04.851 ***** 2025-09-23 19:10:52.589733 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589738 | orchestrator | 2025-09-23 19:10:52.589744 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-09-23 19:10:52.589750 | orchestrator | Tuesday 23 September 2025 19:10:52 +0000 (0:00:00.129) 0:01:04.981 ***** 2025-09-23 19:10:52.589755 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 
19:10:52.589761 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:52.589767 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589772 | orchestrator | 2025-09-23 19:10:52.589778 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-09-23 19:10:52.589783 | orchestrator | Tuesday 23 September 2025 19:10:52 +0000 (0:00:00.156) 0:01:05.138 ***** 2025-09-23 19:10:52.589789 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:52.589795 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:52.589801 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:52.589806 | orchestrator | 2025-09-23 19:10:52.589812 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-09-23 19:10:52.589818 | orchestrator | Tuesday 23 September 2025 19:10:52 +0000 (0:00:00.146) 0:01:05.284 ***** 2025-09-23 19:10:52.589829 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613046 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613134 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:55.613145 | orchestrator | 2025-09-23 19:10:55.613153 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-09-23 19:10:55.613161 | orchestrator | Tuesday 23 September 2025 
19:10:52 +0000 (0:00:00.154) 0:01:05.439 ***** 2025-09-23 19:10:55.613173 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613180 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613189 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:55.613196 | orchestrator | 2025-09-23 19:10:55.613205 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-09-23 19:10:55.613214 | orchestrator | Tuesday 23 September 2025 19:10:52 +0000 (0:00:00.140) 0:01:05.579 ***** 2025-09-23 19:10:55.613235 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613257 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613266 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:55.613274 | orchestrator | 2025-09-23 19:10:55.613283 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-09-23 19:10:55.613291 | orchestrator | Tuesday 23 September 2025 19:10:52 +0000 (0:00:00.150) 0:01:05.730 ***** 2025-09-23 19:10:55.613300 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613308 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613315 | orchestrator | skipping: 
[testbed-node-5] 2025-09-23 19:10:55.613322 | orchestrator | 2025-09-23 19:10:55.613329 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-09-23 19:10:55.613337 | orchestrator | Tuesday 23 September 2025 19:10:53 +0000 (0:00:00.147) 0:01:05.878 ***** 2025-09-23 19:10:55.613346 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613354 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613363 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:55.613370 | orchestrator | 2025-09-23 19:10:55.613378 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-09-23 19:10:55.613385 | orchestrator | Tuesday 23 September 2025 19:10:53 +0000 (0:00:00.345) 0:01:06.223 ***** 2025-09-23 19:10:55.613394 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613401 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613409 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:55.613416 | orchestrator | 2025-09-23 19:10:55.613423 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-09-23 19:10:55.613431 | orchestrator | Tuesday 23 September 2025 19:10:53 +0000 (0:00:00.160) 0:01:06.384 ***** 2025-09-23 19:10:55.613438 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:10:55.613445 | orchestrator | 2025-09-23 19:10:55.613452 | orchestrator | TASK [Get list of Ceph PVs with 
associated VGs] ******************************** 2025-09-23 19:10:55.613460 | orchestrator | Tuesday 23 September 2025 19:10:54 +0000 (0:00:00.515) 0:01:06.899 ***** 2025-09-23 19:10:55.613467 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:10:55.613474 | orchestrator | 2025-09-23 19:10:55.613481 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-09-23 19:10:55.613488 | orchestrator | Tuesday 23 September 2025 19:10:54 +0000 (0:00:00.533) 0:01:07.432 ***** 2025-09-23 19:10:55.613495 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:10:55.613502 | orchestrator | 2025-09-23 19:10:55.613509 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-09-23 19:10:55.613516 | orchestrator | Tuesday 23 September 2025 19:10:54 +0000 (0:00:00.162) 0:01:07.595 ***** 2025-09-23 19:10:55.613523 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'vg_name': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'}) 2025-09-23 19:10:55.613531 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'vg_name': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'}) 2025-09-23 19:10:55.613538 | orchestrator | 2025-09-23 19:10:55.613545 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-09-23 19:10:55.613557 | orchestrator | Tuesday 23 September 2025 19:10:54 +0000 (0:00:00.219) 0:01:07.814 ***** 2025-09-23 19:10:55.613576 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613584 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613590 | orchestrator | skipping: 
[testbed-node-5] 2025-09-23 19:10:55.613598 | orchestrator | 2025-09-23 19:10:55.613604 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-09-23 19:10:55.613612 | orchestrator | Tuesday 23 September 2025 19:10:55 +0000 (0:00:00.175) 0:01:07.989 ***** 2025-09-23 19:10:55.613619 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613627 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613635 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:55.613642 | orchestrator | 2025-09-23 19:10:55.613650 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2025-09-23 19:10:55.613658 | orchestrator | Tuesday 23 September 2025 19:10:55 +0000 (0:00:00.158) 0:01:08.148 ***** 2025-09-23 19:10:55.613665 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})  2025-09-23 19:10:55.613682 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})  2025-09-23 19:10:55.613689 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:10:55.613695 | orchestrator | 2025-09-23 19:10:55.613702 | orchestrator | TASK [Print LVM report data] *************************************************** 2025-09-23 19:10:55.613718 | orchestrator | Tuesday 23 September 2025 19:10:55 +0000 (0:00:00.154) 0:01:08.302 ***** 2025-09-23 19:10:55.613724 | orchestrator | ok: [testbed-node-5] => { 2025-09-23 19:10:55.613731 | orchestrator |  "lvm_report": { 2025-09-23 19:10:55.613739 | orchestrator |  "lv": [ 2025-09-23 
19:10:55.613745 | orchestrator |  { 2025-09-23 19:10:55.613752 | orchestrator |  "lv_name": "osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42", 2025-09-23 19:10:55.613762 | orchestrator |  "vg_name": "ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42" 2025-09-23 19:10:55.613769 | orchestrator |  }, 2025-09-23 19:10:55.613776 | orchestrator |  { 2025-09-23 19:10:55.613783 | orchestrator |  "lv_name": "osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5", 2025-09-23 19:10:55.613789 | orchestrator |  "vg_name": "ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5" 2025-09-23 19:10:55.613796 | orchestrator |  } 2025-09-23 19:10:55.613809 | orchestrator |  ], 2025-09-23 19:10:55.613816 | orchestrator |  "pv": [ 2025-09-23 19:10:55.613823 | orchestrator |  { 2025-09-23 19:10:55.613829 | orchestrator |  "pv_name": "/dev/sdb", 2025-09-23 19:10:55.613836 | orchestrator |  "vg_name": "ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5" 2025-09-23 19:10:55.613843 | orchestrator |  }, 2025-09-23 19:10:55.613850 | orchestrator |  { 2025-09-23 19:10:55.613856 | orchestrator |  "pv_name": "/dev/sdc", 2025-09-23 19:10:55.613863 | orchestrator |  "vg_name": "ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42" 2025-09-23 19:10:55.613870 | orchestrator |  } 2025-09-23 19:10:55.613877 | orchestrator |  ] 2025-09-23 19:10:55.613884 | orchestrator |  } 2025-09-23 19:10:55.613890 | orchestrator | } 2025-09-23 19:10:55.613898 | orchestrator | 2025-09-23 19:10:55.613904 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:10:55.613916 | orchestrator | testbed-node-3 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-09-23 19:10:55.613922 | orchestrator | testbed-node-4 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-09-23 19:10:55.613929 | orchestrator | testbed-node-5 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-09-23 19:10:55.613936 | orchestrator | 2025-09-23 19:10:55.613943 | 
orchestrator | 2025-09-23 19:10:55.613949 | orchestrator | 2025-09-23 19:10:55.613956 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:10:55.613963 | orchestrator | Tuesday 23 September 2025 19:10:55 +0000 (0:00:00.137) 0:01:08.440 ***** 2025-09-23 19:10:55.613970 | orchestrator | =============================================================================== 2025-09-23 19:10:55.613976 | orchestrator | Create block VGs -------------------------------------------------------- 5.80s 2025-09-23 19:10:55.613983 | orchestrator | Create block LVs -------------------------------------------------------- 4.32s 2025-09-23 19:10:55.613989 | orchestrator | Gather DB VGs with total and available size in bytes -------------------- 1.80s 2025-09-23 19:10:55.613996 | orchestrator | Gather DB+WAL VGs with total and available size in bytes ---------------- 1.73s 2025-09-23 19:10:55.614002 | orchestrator | Get list of Ceph PVs with associated VGs -------------------------------- 1.61s 2025-09-23 19:10:55.614008 | orchestrator | Get list of Ceph LVs with associated VGs -------------------------------- 1.55s 2025-09-23 19:10:55.614066 | orchestrator | Gather WAL VGs with total and available size in bytes ------------------- 1.50s 2025-09-23 19:10:55.614075 | orchestrator | Add known partitions to the list of available block devices ------------- 1.32s 2025-09-23 19:10:55.614089 | orchestrator | Add known links to the list of available block devices ------------------ 1.18s 2025-09-23 19:10:55.872958 | orchestrator | Add known partitions to the list of available block devices ------------- 1.10s 2025-09-23 19:10:55.873045 | orchestrator | Print LVM report data --------------------------------------------------- 0.85s 2025-09-23 19:10:55.873060 | orchestrator | Add known links to the list of available block devices ------------------ 0.85s 2025-09-23 19:10:55.873072 | orchestrator | Add known partitions to the list of 
available block devices ------------- 0.80s 2025-09-23 19:10:55.873083 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.68s 2025-09-23 19:10:55.873094 | orchestrator | Create DB+WAL VGs ------------------------------------------------------- 0.68s 2025-09-23 19:10:55.873154 | orchestrator | Get initial list of available block devices ----------------------------- 0.63s 2025-09-23 19:10:55.873165 | orchestrator | Create DB LVs for ceph_db_wal_devices ----------------------------------- 0.62s 2025-09-23 19:10:55.873176 | orchestrator | Add known links to the list of available block devices ------------------ 0.61s 2025-09-23 19:10:55.873187 | orchestrator | Fail if block LV defined in lvm_volumes is missing ---------------------- 0.60s 2025-09-23 19:10:55.873198 | orchestrator | Add known partitions to the list of available block devices ------------- 0.59s 2025-09-23 19:11:07.977953 | orchestrator | 2025-09-23 19:11:07 | INFO  | Task 92393c15-590a-403e-ac3f-a7307dfd7326 (facts) was prepared for execution. 2025-09-23 19:11:07.978113 | orchestrator | 2025-09-23 19:11:07 | INFO  | It takes a moment until task 92393c15-590a-403e-ac3f-a7307dfd7326 (facts) has been started and output is visible here. 
2025-09-23 19:11:19.706282 | orchestrator |
2025-09-23 19:11:19.706400 | orchestrator | PLAY [Apply role facts] ********************************************************
2025-09-23 19:11:19.706417 | orchestrator |
2025-09-23 19:11:19.706430 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] *********************
2025-09-23 19:11:19.706442 | orchestrator | Tuesday 23 September 2025 19:11:11 +0000 (0:00:00.247) 0:00:00.247 *****
2025-09-23 19:11:19.706453 | orchestrator | ok: [testbed-manager]
2025-09-23 19:11:19.706465 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:11:19.706502 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:11:19.706514 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:11:19.706525 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:11:19.706536 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:11:19.706546 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:11:19.706557 | orchestrator |
2025-09-23 19:11:19.706568 | orchestrator | TASK [osism.commons.facts : Copy fact files] ***********************************
2025-09-23 19:11:19.706579 | orchestrator | Tuesday 23 September 2025 19:11:12 +0000 (0:00:01.161) 0:00:01.408 *****
2025-09-23 19:11:19.706604 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:11:19.706616 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:11:19.706628 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:11:19.706639 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:11:19.706649 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:11:19.706660 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:11:19.706671 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:11:19.706681 | orchestrator |
2025-09-23 19:11:19.706692 | orchestrator | PLAY [Gather facts for all hosts] **********************************************
2025-09-23 19:11:19.706703 | orchestrator |
2025-09-23 19:11:19.706714 | orchestrator | TASK [Gathers facts about hosts] ***********************************************
2025-09-23 19:11:19.706725 | orchestrator | Tuesday 23 September 2025 19:11:14 +0000 (0:00:01.266) 0:00:02.675 *****
2025-09-23 19:11:19.706736 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:11:19.706746 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:11:19.706757 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:11:19.706768 | orchestrator | ok: [testbed-manager]
2025-09-23 19:11:19.706779 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:11:19.706790 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:11:19.706800 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:11:19.706811 | orchestrator |
2025-09-23 19:11:19.706822 | orchestrator | PLAY [Gather facts for all hosts if using --limit] *****************************
2025-09-23 19:11:19.706835 | orchestrator |
2025-09-23 19:11:19.706847 | orchestrator | TASK [Gather facts for all hosts] **********************************************
2025-09-23 19:11:19.706859 | orchestrator | Tuesday 23 September 2025 19:11:18 +0000 (0:00:04.828) 0:00:07.504 *****
2025-09-23 19:11:19.706872 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:11:19.706884 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:11:19.706896 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:11:19.706909 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:11:19.706921 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:11:19.706933 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:11:19.706944 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:11:19.706956 | orchestrator |
2025-09-23 19:11:19.706968 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:11:19.706981 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:11:19.706995 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:11:19.707006 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:11:19.707017 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:11:19.707028 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:11:19.707039 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:11:19.707072 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:11:19.707093 | orchestrator |
2025-09-23 19:11:19.707105 | orchestrator |
2025-09-23 19:11:19.707116 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:11:19.707126 | orchestrator | Tuesday 23 September 2025 19:11:19 +0000 (0:00:00.499) 0:00:08.003 *****
2025-09-23 19:11:19.707137 | orchestrator | ===============================================================================
2025-09-23 19:11:19.707148 | orchestrator | Gathers facts about hosts ----------------------------------------------- 4.83s
2025-09-23 19:11:19.707159 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.27s
2025-09-23 19:11:19.707169 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.16s
2025-09-23 19:11:19.707180 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.50s
2025-09-23 19:11:31.946756 | orchestrator | 2025-09-23 19:11:31 | INFO  | Task b19a9ad5-443a-40fb-b765-7afe3e0d8c2d (frr) was prepared for execution.
2025-09-23 19:11:31.946877 | orchestrator | 2025-09-23 19:11:31 | INFO  | It takes a moment until task b19a9ad5-443a-40fb-b765-7afe3e0d8c2d (frr) has been started and output is visible here.
2025-09-23 19:11:57.626312 | orchestrator |
2025-09-23 19:11:57.626442 | orchestrator | PLAY [Apply role frr] **********************************************************
2025-09-23 19:11:57.626466 | orchestrator |
2025-09-23 19:11:57.626488 | orchestrator | TASK [osism.services.frr : Include distribution specific install tasks] ********
2025-09-23 19:11:57.626506 | orchestrator | Tuesday 23 September 2025 19:11:36 +0000 (0:00:00.273) 0:00:00.274 *****
2025-09-23 19:11:57.626526 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/frr/tasks/install-Debian-family.yml for testbed-manager
2025-09-23 19:11:57.626545 | orchestrator |
2025-09-23 19:11:57.626564 | orchestrator | TASK [osism.services.frr : Pin frr package version] ****************************
2025-09-23 19:11:57.626581 | orchestrator | Tuesday 23 September 2025 19:11:36 +0000 (0:00:00.250) 0:00:00.524 *****
2025-09-23 19:11:57.626597 | orchestrator | changed: [testbed-manager]
2025-09-23 19:11:57.626617 | orchestrator |
2025-09-23 19:11:57.626637 | orchestrator | TASK [osism.services.frr : Install frr package] ********************************
2025-09-23 19:11:57.626657 | orchestrator | Tuesday 23 September 2025 19:11:37 +0000 (0:00:01.263) 0:00:01.788 *****
2025-09-23 19:11:57.626676 | orchestrator | changed: [testbed-manager]
2025-09-23 19:11:57.626694 | orchestrator |
2025-09-23 19:11:57.626735 | orchestrator | TASK [osism.services.frr : Copy file: /etc/frr/vtysh.conf] *********************
2025-09-23 19:11:57.626755 | orchestrator | Tuesday 23 September 2025 19:11:47 +0000 (0:00:09.514) 0:00:11.302 *****
2025-09-23 19:11:57.626775 | orchestrator | ok: [testbed-manager]
2025-09-23 19:11:57.626796 | orchestrator |
2025-09-23 19:11:57.626814 | orchestrator | TASK [osism.services.frr : Copy file: /etc/frr/daemons] ************************
2025-09-23 19:11:57.626835 | orchestrator | Tuesday 23 September 2025 19:11:48 +0000 (0:00:01.254) 0:00:12.557 *****
2025-09-23 19:11:57.626855 | orchestrator | changed: [testbed-manager]
2025-09-23 19:11:57.626876 | orchestrator |
2025-09-23 19:11:57.626896 | orchestrator | TASK [osism.services.frr : Set _frr_uplinks fact] ******************************
2025-09-23 19:11:57.626913 | orchestrator | Tuesday 23 September 2025 19:11:49 +0000 (0:00:00.946) 0:00:13.504 *****
2025-09-23 19:11:57.626932 | orchestrator | ok: [testbed-manager]
2025-09-23 19:11:57.626951 | orchestrator |
2025-09-23 19:11:57.626969 | orchestrator | TASK [osism.services.frr : Check for frr.conf file in the configuration repository] ***
2025-09-23 19:11:57.627018 | orchestrator | Tuesday 23 September 2025 19:11:50 +0000 (0:00:01.161) 0:00:14.665 *****
2025-09-23 19:11:57.627036 | orchestrator | ok: [testbed-manager -> localhost]
2025-09-23 19:11:57.627053 | orchestrator |
2025-09-23 19:11:57.627072 | orchestrator | TASK [osism.services.frr : Copy file from the configuration repository: /etc/frr/frr.conf] ***
2025-09-23 19:11:57.627091 | orchestrator | Tuesday 23 September 2025 19:11:51 +0000 (0:00:00.798) 0:00:15.464 *****
2025-09-23 19:11:57.627111 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:11:57.627131 | orchestrator |
2025-09-23 19:11:57.627151 | orchestrator | TASK [osism.services.frr : Copy file from the role: /etc/frr/frr.conf] *********
2025-09-23 19:11:57.627202 | orchestrator | Tuesday 23 September 2025 19:11:51 +0000 (0:00:00.155) 0:00:15.620 *****
2025-09-23 19:11:57.627219 | orchestrator | changed: [testbed-manager]
2025-09-23 19:11:57.627235 | orchestrator |
2025-09-23 19:11:57.627251 | orchestrator | TASK [osism.services.frr : Set sysctl parameters] ******************************
2025-09-23 19:11:57.627266 | orchestrator | Tuesday 23 September 2025 19:11:52 +0000 (0:00:00.947) 0:00:16.567 *****
2025-09-23 19:11:57.627285 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.ip_forward', 'value': 1})
2025-09-23 19:11:57.627304 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.send_redirects', 'value': 0})
2025-09-23 19:11:57.627323 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.accept_redirects', 'value': 0})
2025-09-23 19:11:57.627344 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.fib_multipath_hash_policy', 'value': 1})
2025-09-23 19:11:57.627361 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.default.ignore_routes_with_linkdown', 'value': 1})
2025-09-23 19:11:57.627379 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.rp_filter', 'value': 2})
2025-09-23 19:11:57.627397 | orchestrator |
2025-09-23 19:11:57.627417 | orchestrator | TASK [osism.services.frr : Manage frr service] *********************************
2025-09-23 19:11:57.627438 | orchestrator | Tuesday 23 September 2025 19:11:54 +0000 (0:00:02.126) 0:00:18.694 *****
2025-09-23 19:11:57.627458 | orchestrator | ok: [testbed-manager]
2025-09-23 19:11:57.627477 | orchestrator |
2025-09-23 19:11:57.627496 | orchestrator | RUNNING HANDLER [osism.services.frr : Restart frr service] *********************
2025-09-23 19:11:57.627515 | orchestrator | Tuesday 23 September 2025 19:11:55 +0000 (0:00:01.318) 0:00:20.013 *****
2025-09-23 19:11:57.627535 | orchestrator | changed: [testbed-manager]
2025-09-23 19:11:57.627554 | orchestrator |
2025-09-23 19:11:57.627572 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:11:57.627592 | orchestrator | testbed-manager : ok=11  changed=6  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2025-09-23 19:11:57.627611 | orchestrator |
2025-09-23 19:11:57.627630 | orchestrator |
2025-09-23 19:11:57.627649 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:11:57.627667 | orchestrator | Tuesday 23 September 2025 19:11:57 +0000 (0:00:01.424) 0:00:21.437 *****
2025-09-23 19:11:57.627686 | orchestrator | ===============================================================================
2025-09-23 19:11:57.627705 | orchestrator | osism.services.frr : Install frr package -------------------------------- 9.51s
2025-09-23 19:11:57.627724 | orchestrator | osism.services.frr : Set sysctl parameters ------------------------------ 2.13s
2025-09-23 19:11:57.627743 | orchestrator | osism.services.frr : Restart frr service -------------------------------- 1.42s
2025-09-23 19:11:57.627760 | orchestrator | osism.services.frr : Manage frr service --------------------------------- 1.32s
2025-09-23 19:11:57.627807 | orchestrator | osism.services.frr : Pin frr package version ---------------------------- 1.26s
2025-09-23 19:11:57.627826 | orchestrator | osism.services.frr : Copy file: /etc/frr/vtysh.conf --------------------- 1.25s
2025-09-23 19:11:57.627845 | orchestrator | osism.services.frr : Set _frr_uplinks fact ------------------------------ 1.16s
2025-09-23 19:11:57.627863 | orchestrator | osism.services.frr : Copy file from the role: /etc/frr/frr.conf --------- 0.95s
2025-09-23 19:11:57.627882 | orchestrator | osism.services.frr : Copy file: /etc/frr/daemons ------------------------ 0.95s
2025-09-23 19:11:57.627902 | orchestrator | osism.services.frr : Check for frr.conf file in the configuration repository --- 0.80s
2025-09-23 19:11:57.627921 | orchestrator | osism.services.frr : Include distribution specific install tasks -------- 0.25s
2025-09-23 19:11:57.627939 | orchestrator | osism.services.frr : Copy file from the configuration repository: /etc/frr/frr.conf --- 0.16s
2025-09-23 19:11:57.947907 | orchestrator |
2025-09-23 19:11:57.949307 | orchestrator | --> DEPLOY IN A NUTSHELL -- START -- Tue Sep 23 19:11:57 UTC 2025
2025-09-23 19:11:57.949374 | orchestrator |
2025-09-23 19:11:59.808905 | orchestrator | 2025-09-23 19:11:59 | INFO  | Collection nutshell is prepared for execution
2025-09-23 19:11:59.809020 | orchestrator | 2025-09-23
19:11:59 | INFO  | D [0] - dotfiles
2025-09-23 19:12:09.906957 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [0] - homer
2025-09-23 19:12:09.907087 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [0] - netdata
2025-09-23 19:12:09.907103 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [0] - openstackclient
2025-09-23 19:12:09.907115 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [0] - phpmyadmin
2025-09-23 19:12:09.907125 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [0] - common
2025-09-23 19:12:09.910556 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [1] -- loadbalancer
2025-09-23 19:12:09.910582 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [2] --- opensearch
2025-09-23 19:12:09.910594 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [2] --- mariadb-ng
2025-09-23 19:12:09.911081 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [3] ---- horizon
2025-09-23 19:12:09.911102 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [3] ---- keystone
2025-09-23 19:12:09.911412 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [4] ----- neutron
2025-09-23 19:12:09.911433 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [5] ------ wait-for-nova
2025-09-23 19:12:09.911648 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [5] ------ octavia
2025-09-23 19:12:09.913298 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [4] ----- barbican
2025-09-23 19:12:09.913319 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [4] ----- designate
2025-09-23 19:12:09.913330 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [4] ----- ironic
2025-09-23 19:12:09.913341 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [4] ----- placement
2025-09-23 19:12:09.913770 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [4] ----- magnum
2025-09-23 19:12:09.914565 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [1] -- openvswitch
2025-09-23 19:12:09.914585 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [2] --- ovn
2025-09-23 19:12:09.914678 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [1] -- memcached
2025-09-23 19:12:09.915025 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [1] -- redis
2025-09-23 19:12:09.915046 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [1] -- rabbitmq-ng
2025-09-23 19:12:09.915383 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [0] - kubernetes
2025-09-23 19:12:09.917905 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [1] -- kubeconfig
2025-09-23 19:12:09.917935 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [1] -- copy-kubeconfig
2025-09-23 19:12:09.918179 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [0] - ceph
2025-09-23 19:12:09.920446 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [1] -- ceph-pools
2025-09-23 19:12:09.920555 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [2] --- copy-ceph-keys
2025-09-23 19:12:09.920568 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [3] ---- cephclient
2025-09-23 19:12:09.920586 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [4] ----- ceph-bootstrap-dashboard
2025-09-23 19:12:09.920597 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [4] ----- wait-for-keystone
2025-09-23 19:12:09.920842 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [5] ------ kolla-ceph-rgw
2025-09-23 19:12:09.921048 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [5] ------ glance
2025-09-23 19:12:09.921079 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [5] ------ cinder
2025-09-23 19:12:09.921361 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [5] ------ nova
2025-09-23 19:12:09.923222 | orchestrator | 2025-09-23 19:12:09 | INFO  | A [4] ----- prometheus
2025-09-23 19:12:09.923255 | orchestrator | 2025-09-23 19:12:09 | INFO  | D [5] ------ grafana
2025-09-23 19:12:10.102122 | orchestrator | 2025-09-23 19:12:10 | INFO  | All tasks of the collection nutshell are prepared for execution
2025-09-23 19:12:10.102220 | orchestrator | 2025-09-23 19:12:10 | INFO  | Tasks are running in the background
2025-09-23 19:12:12.863829 | orchestrator | 2025-09-23 19:12:12 | INFO  | No task IDs specified, wait for
all currently running tasks
2025-09-23 19:12:14.957659 | orchestrator | 2025-09-23 19:12:14 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED
2025-09-23 19:12:14.957705 | orchestrator | 2025-09-23 19:12:14 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED
2025-09-23 19:12:14.959626 | orchestrator | 2025-09-23 19:12:14 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED
2025-09-23 19:12:14.959643 | orchestrator | 2025-09-23 19:12:14 | INFO  | Task 9a7be508-dd7b-4f34-8f22-c234797de858 is in state STARTED
2025-09-23 19:12:14.959651 | orchestrator | 2025-09-23 19:12:14 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED
2025-09-23 19:12:14.959658 | orchestrator | 2025-09-23 19:12:14 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:12:14.960268 | orchestrator | 2025-09-23 19:12:14 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:12:14.960283 | orchestrator | 2025-09-23 19:12:14 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:12:17.989556 | orchestrator | 2025-09-23 19:12:17 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED
2025-09-23 19:12:17.989835 | orchestrator | 2025-09-23 19:12:17 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED
2025-09-23 19:12:17.990384 | orchestrator | 2025-09-23 19:12:17 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED
2025-09-23 19:12:17.991011 | orchestrator | 2025-09-23 19:12:17 | INFO  | Task 9a7be508-dd7b-4f34-8f22-c234797de858 is in state STARTED
2025-09-23 19:12:17.991572 | orchestrator | 2025-09-23 19:12:17 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED
2025-09-23 19:12:17.992143 | orchestrator | 2025-09-23 19:12:17 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:12:17.992670 | orchestrator | 2025-09-23 19:12:17 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:12:17.992882 | orchestrator | 2025-09-23 19:12:17 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:12:21.029600 | orchestrator | 2025-09-23 19:12:21 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED
2025-09-23 19:12:21.031020 | orchestrator | 2025-09-23 19:12:21 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED
2025-09-23 19:12:21.031462 | orchestrator | 2025-09-23 19:12:21 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED
2025-09-23 19:12:21.031924 | orchestrator | 2025-09-23 19:12:21 | INFO  | Task 9a7be508-dd7b-4f34-8f22-c234797de858 is in state STARTED
2025-09-23 19:12:21.032467 | orchestrator | 2025-09-23 19:12:21 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED
2025-09-23 19:12:21.032990 | orchestrator | 2025-09-23 19:12:21 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:12:21.037633 | orchestrator | 2025-09-23 19:12:21 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:12:21.037657 | orchestrator | 2025-09-23 19:12:21 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:12:24.195070 | orchestrator | 2025-09-23 19:12:24 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED
2025-09-23 19:12:24.195259 | orchestrator | 2025-09-23 19:12:24 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED
2025-09-23 19:12:24.195274 | orchestrator | 2025-09-23 19:12:24 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED
2025-09-23 19:12:24.195296 | orchestrator | 2025-09-23 19:12:24 | INFO  | Task 9a7be508-dd7b-4f34-8f22-c234797de858 is in state STARTED
2025-09-23 19:12:24.195910 | orchestrator | 2025-09-23 19:12:24 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED
2025-09-23 19:12:24.197217 | orchestrator | 2025-09-23 19:12:24 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:12:24.197367 | orchestrator | 2025-09-23 19:12:24 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:12:24.197396 | orchestrator | 2025-09-23 19:12:24 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:12:27.482177 | orchestrator | 2025-09-23 19:12:27 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED
2025-09-23 19:12:27.482268 | orchestrator | 2025-09-23 19:12:27 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED
2025-09-23 19:12:27.482285 | orchestrator | 2025-09-23 19:12:27 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED
2025-09-23 19:12:27.482297 | orchestrator | 2025-09-23 19:12:27 | INFO  | Task 9a7be508-dd7b-4f34-8f22-c234797de858 is in state STARTED
2025-09-23 19:12:27.482308 | orchestrator | 2025-09-23 19:12:27 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED
2025-09-23 19:12:27.482319 | orchestrator | 2025-09-23 19:12:27 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:12:27.482330 | orchestrator | 2025-09-23 19:12:27 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:12:27.482341 | orchestrator | 2025-09-23 19:12:27 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:12:30.549657 | orchestrator | 2025-09-23 19:12:30 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED
2025-09-23 19:12:30.555684 | orchestrator | 2025-09-23 19:12:30 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED
2025-09-23 19:12:30.556257 | orchestrator | 2025-09-23 19:12:30 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED
2025-09-23 19:12:30.558419 | orchestrator | 2025-09-23 19:12:30 | INFO  | Task 9a7be508-dd7b-4f34-8f22-c234797de858 is in state STARTED
2025-09-23 19:12:30.563653 | orchestrator | 2025-09-23 19:12:30 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED
2025-09-23 19:12:30.565179 | orchestrator | 2025-09-23 19:12:30 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:12:30.566488 | orchestrator | 2025-09-23 19:12:30 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:12:30.570256 | orchestrator | 2025-09-23 19:12:30 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:12:33.665218 | orchestrator | 2025-09-23 19:12:33 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED
2025-09-23 19:12:33.665728 | orchestrator | 2025-09-23 19:12:33 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED
2025-09-23 19:12:33.667126 | orchestrator | 2025-09-23 19:12:33 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED
2025-09-23 19:12:33.667552 | orchestrator | 2025-09-23 19:12:33 | INFO  | Task 9a7be508-dd7b-4f34-8f22-c234797de858 is in state SUCCESS
2025-09-23 19:12:33.667936 | orchestrator |
2025-09-23 19:12:33.667954 | orchestrator | PLAY [Apply role geerlingguy.dotfiles] *****************************************
2025-09-23 19:12:33.667973 | orchestrator |
2025-09-23 19:12:33.667980 | orchestrator | TASK [geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally.]
****
2025-09-23 19:12:33.667987 | orchestrator | Tuesday 23 September 2025 19:12:20 +0000 (0:00:00.520) 0:00:00.520 *****
2025-09-23 19:12:33.667993 | orchestrator | changed: [testbed-manager]
2025-09-23 19:12:33.667999 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:12:33.668005 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:12:33.668010 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:12:33.668016 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:12:33.668021 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:12:33.668026 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:12:33.668031 | orchestrator |
2025-09-23 19:12:33.668037 | orchestrator | TASK [geerlingguy.dotfiles : Ensure all configured dotfiles are links.] ********
2025-09-23 19:12:33.668043 | orchestrator | Tuesday 23 September 2025 19:12:24 +0000 (0:00:04.134) 0:00:04.655 *****
2025-09-23 19:12:33.668049 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf)
2025-09-23 19:12:33.668056 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf)
2025-09-23 19:12:33.668062 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf)
2025-09-23 19:12:33.668067 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf)
2025-09-23 19:12:33.668072 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf)
2025-09-23 19:12:33.668078 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf)
2025-09-23 19:12:33.668084 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf)
2025-09-23 19:12:33.668090 | orchestrator |
2025-09-23 19:12:33.668097 | orchestrator | TASK [geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked.] ***
2025-09-23 19:12:33.668103 | orchestrator | Tuesday 23 September 2025 19:12:26 +0000 (0:00:01.844) 0:00:06.499 *****
2025-09-23 19:12:33.668117 | orchestrator | ok: [testbed-node-0] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-23 19:12:25.824593', 'end': '2025-09-23 19:12:25.830104', 'delta': '0:00:00.005511', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}])
2025-09-23 19:12:33.668125 | orchestrator | ok: [testbed-node-2] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-23 19:12:26.273066', 'end': '2025-09-23 19:12:26.282630', 'delta': '0:00:00.009564', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}])
2025-09-23 19:12:33.668134 | orchestrator | ok: [testbed-node-1] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-23 19:12:26.073151', 'end': '2025-09-23 19:12:26.078828', 'delta': '0:00:00.005677', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}])
2025-09-23 19:12:33.668162 | orchestrator | ok: [testbed-node-4] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-23 19:12:26.389653', 'end': '2025-09-23 19:12:26.398440', 'delta': '0:00:00.008787', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}])
2025-09-23 19:12:33.668170 | orchestrator | ok: [testbed-node-5] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-23 19:12:26.551132', 'end': '2025-09-23 19:12:26.557673', 'delta': '0:00:00.006541', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}])
2025-09-23 19:12:33.668176 | orchestrator | ok: [testbed-manager] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-23 19:12:26.262794', 'end': '2025-09-23 19:12:26.268683', 'delta': '0:00:00.005889', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}])
2025-09-23 19:12:33.668186 | orchestrator | ok: [testbed-node-3] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-23 19:12:26.292396', 'end': '2025-09-23 19:12:26.301408', 'delta': '0:00:00.009012', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}])
2025-09-23 19:12:33.668192 | orchestrator |
2025-09-23 19:12:33.668198 | orchestrator | TASK [geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist.] ****
2025-09-23 19:12:33.668205 | orchestrator | Tuesday 23 September 2025 19:12:27 +0000 (0:00:01.134) 0:00:07.633 *****
2025-09-23 19:12:33.668219 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf)
2025-09-23 19:12:33.668224 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf)
2025-09-23 19:12:33.668230 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf)
2025-09-23 19:12:33.668235 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf)
2025-09-23 19:12:33.668240 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf)
2025-09-23 19:12:33.668245 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf)
2025-09-23 19:12:33.668250 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf)
2025-09-23 19:12:33.668256 | orchestrator |
2025-09-23 19:12:33.668261 | orchestrator | TASK [geerlingguy.dotfiles : Link dotfiles into home folder.] ******************
2025-09-23 19:12:33.668266 | orchestrator | Tuesday 23 September 2025 19:12:29 +0000 (0:00:02.119) 0:00:09.753 *****
2025-09-23 19:12:33.668271 | orchestrator | changed: [testbed-manager] => (item=.tmux.conf)
2025-09-23 19:12:33.668277 | orchestrator | changed: [testbed-node-3] => (item=.tmux.conf)
2025-09-23 19:12:33.668282 | orchestrator | changed: [testbed-node-2] => (item=.tmux.conf)
2025-09-23 19:12:33.668287 | orchestrator | changed: [testbed-node-0] => (item=.tmux.conf)
2025-09-23 19:12:33.668292 | orchestrator | changed: [testbed-node-1] => (item=.tmux.conf)
2025-09-23 19:12:33.668298 | orchestrator | changed: [testbed-node-4] => (item=.tmux.conf)
2025-09-23 19:12:33.668303 | orchestrator | changed: [testbed-node-5] => (item=.tmux.conf)
2025-09-23 19:12:33.668308 | orchestrator |
2025-09-23 19:12:33.668313 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:12:33.668323 | orchestrator | testbed-manager : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:12:33.668329 | orchestrator | testbed-node-0 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:12:33.668334 | orchestrator | testbed-node-1 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:12:33.668340 | orchestrator | testbed-node-2 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:12:33.668345 | orchestrator | testbed-node-3 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:12:33.668350 | orchestrator | testbed-node-4 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:12:33.668356 | orchestrator | testbed-node-5 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:12:33.668362 | orchestrator |
2025-09-23 19:12:33.668368 | orchestrator |
2025-09-23 19:12:33.668374 | orchestrator | TASKS
RECAP ******************************************************************** 2025-09-23 19:12:33.668380 | orchestrator | Tuesday 23 September 2025 19:12:32 +0000 (0:00:02.343) 0:00:12.096 ***** 2025-09-23 19:12:33.668385 | orchestrator | =============================================================================== 2025-09-23 19:12:33.668391 | orchestrator | geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally. ---- 4.13s 2025-09-23 19:12:33.668396 | orchestrator | geerlingguy.dotfiles : Link dotfiles into home folder. ------------------ 2.34s 2025-09-23 19:12:33.668401 | orchestrator | geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist. ---- 2.12s 2025-09-23 19:12:33.668406 | orchestrator | geerlingguy.dotfiles : Ensure all configured dotfiles are links. -------- 1.84s 2025-09-23 19:12:33.668411 | orchestrator | geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked. --- 1.13s 2025-09-23 19:12:33.673229 | orchestrator | 2025-09-23 19:12:33 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:12:33.676883 | orchestrator | 2025-09-23 19:12:33 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:33.683101 | orchestrator | 2025-09-23 19:12:33 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:33.683130 | orchestrator | 2025-09-23 19:12:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:12:37.075029 | orchestrator | 2025-09-23 19:12:36 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED 2025-09-23 19:12:37.075129 | orchestrator | 2025-09-23 19:12:36 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED 2025-09-23 19:12:37.075150 | orchestrator | 2025-09-23 19:12:36 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:12:37.075168 | orchestrator | 2025-09-23 19:12:36 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is 
in state STARTED 2025-09-23 19:12:37.075187 | orchestrator | 2025-09-23 19:12:36 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:12:37.075225 | orchestrator | 2025-09-23 19:12:36 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:37.075246 | orchestrator | 2025-09-23 19:12:36 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:37.075262 | orchestrator | 2025-09-23 19:12:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:12:39.875570 | orchestrator | 2025-09-23 19:12:39 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state STARTED 2025-09-23 19:12:39.875654 | orchestrator | 2025-09-23 19:12:39 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED 2025-09-23 19:12:39.875669 | orchestrator | 2025-09-23 19:12:39 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:12:39.875680 | orchestrator | 2025-09-23 19:12:39 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:12:39.875691 | orchestrator | 2025-09-23 19:12:39 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:12:39.875702 | orchestrator | 2025-09-23 19:12:39 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:39.875713 | orchestrator | 2025-09-23 19:12:39 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:39.875724 | orchestrator | 2025-09-23 19:12:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:12:43.071849 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED 2025-09-23 19:12:43.078148 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task fb7ead46-75a5-42fb-ab97-b3f4ab8d611d is in state SUCCESS 2025-09-23 19:12:43.081043 | orchestrator | 2025-09-23 19:12:43.081084 | orchestrator | 2025-09-23 19:12:43.081097 | 
orchestrator | PLAY [Apply role common] ******************************************************* 2025-09-23 19:12:43.081109 | orchestrator | 2025-09-23 19:12:43.081121 | orchestrator | TASK [common : include_tasks] ************************************************** 2025-09-23 19:12:43.081132 | orchestrator | Tuesday 23 September 2025 19:12:14 +0000 (0:00:00.213) 0:00:00.213 ***** 2025-09-23 19:12:43.081143 | orchestrator | included: /ansible/roles/common/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:12:43.081156 | orchestrator | 2025-09-23 19:12:43.081167 | orchestrator | TASK [common : Ensuring config directories exist] ****************************** 2025-09-23 19:12:43.081178 | orchestrator | Tuesday 23 September 2025 19:12:15 +0000 (0:00:01.139) 0:00:01.352 ***** 2025-09-23 19:12:43.081189 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-23 19:12:43.081200 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-23 19:12:43.081211 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-23 19:12:43.081239 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-23 19:12:43.081251 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-23 19:12:43.081262 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-23 19:12:43.081272 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-23 19:12:43.081283 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-23 19:12:43.081294 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-23 19:12:43.081304 | orchestrator | changed: 
[testbed-node-0] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-23 19:12:43.081316 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-23 19:12:43.081328 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-23 19:12:43.081339 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-23 19:12:43.081350 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-23 19:12:43.081361 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-23 19:12:43.081371 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-23 19:12:43.081382 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-23 19:12:43.081393 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-23 19:12:43.081428 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-23 19:12:43.081440 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-23 19:12:43.081451 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-23 19:12:43.081463 | orchestrator | 2025-09-23 19:12:43.081474 | orchestrator | TASK [common : include_tasks] ************************************************** 2025-09-23 19:12:43.081485 | orchestrator | Tuesday 23 September 2025 19:12:19 +0000 (0:00:03.982) 0:00:05.334 ***** 2025-09-23 19:12:43.081497 | orchestrator | included: /ansible/roles/common/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:12:43.081509 | 
orchestrator | 2025-09-23 19:12:43.081520 | orchestrator | TASK [service-cert-copy : common | Copying over extra CA certificates] ********* 2025-09-23 19:12:43.081531 | orchestrator | Tuesday 23 September 2025 19:12:20 +0000 (0:00:01.332) 0:00:06.666 ***** 2025-09-23 19:12:43.081546 | orchestrator | changed: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-23 19:12:43.081563 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-23 19:12:43.081620 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-23 19:12:43.081644 | orchestrator | changed: [testbed-manager] => (item={'key': 
'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-23 19:12:43.081657 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-23 19:12:43.081928 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-23 19:12:43.081948 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': 
['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.081983 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.081996 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082090 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', 
'/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-23 19:12:43.082115 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082127 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082139 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082166 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 
'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082178 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082189 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082201 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': 
{}}}) 2025-09-23 19:12:43.082233 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082245 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082256 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082267 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.082278 | orchestrator | 2025-09-23 19:12:43.082290 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS certificate] *** 
2025-09-23 19:12:43.082301 | orchestrator | Tuesday 23 September 2025 19:12:26 +0000 (0:00:05.380) 0:00:12.047 ***** 2025-09-23 19:12:43.082317 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-23 19:12:43.082329 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:12:43.082340 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:12:43.082352 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:12:43.082370 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 
'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-23 19:12:43.082387 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:12:43.082399 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:12:43.082410 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:12:43.082437 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-23 19:12:43.082449 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:12:43.082476 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:12:43.082489 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-23 19:12:43.082500 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 
'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082518 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082529 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:12:43.082553 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.082564 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082576 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082587 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:12:43.082598 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:12:43.082609 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.082628 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082640 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082656 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:12:43.082668 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.082685 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082697 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082708 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:12:43.082719 | orchestrator |
2025-09-23 19:12:43.082730 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS key] ******
2025-09-23 19:12:43.082741 | orchestrator | Tuesday 23 September 2025 19:12:27 +0000 (0:00:01.293) 0:00:13.340 *****
2025-09-23 19:12:43.082752 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.082763 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082775 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082786 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:12:43.082797 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.082814 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082825 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082836 |
orchestrator | skipping: [testbed-node-0]
2025-09-23 19:12:43.082859 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.082871 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082882 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082914 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.082930 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082948 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.082977 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:12:43.082989 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.083007 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083019 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083030 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:12:43.083041 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:12:43.083052 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.083063 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083074 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083092 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:12:43.083206 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})
2025-09-23 19:12:43.083221 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083233 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083244 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:12:43.083255 | orchestrator |
2025-09-23 19:12:43.083266 | orchestrator | TASK [common : Copying over /run subdirectories conf] **************************
2025-09-23 19:12:43.083277 | orchestrator | Tuesday 23 September 2025 19:12:31 +0000 (0:00:03.833) 0:00:17.174 *****
2025-09-23 19:12:43.083288 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:12:43.083298 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:12:43.083309 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:12:43.083320 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:12:43.083330 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:12:43.083347 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:12:43.083359 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:12:43.083369 | orchestrator |
2025-09-23 19:12:43.083380 | orchestrator | TASK [common : Restart systemd-tmpfiles] ***************************************
2025-09-23 19:12:43.083391 | orchestrator | Tuesday 23 September 2025 19:12:32 +0000 (0:00:01.461) 0:00:18.635 *****
2025-09-23 19:12:43.083402 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:12:43.083413 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:12:43.083423 |
orchestrator | skipping: [testbed-node-1]
2025-09-23 19:12:43.083434 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:12:43.083444 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:12:43.083455 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:12:43.083466 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:12:43.083476 | orchestrator |
2025-09-23 19:12:43.083487 | orchestrator | TASK [common : Copying over config.json files for services] ********************
2025-09-23 19:12:43.083498 | orchestrator | Tuesday 23 September 2025 19:12:34 +0000 (0:00:01.977) 0:00:20.613 *****
2025-09-23 19:12:43.083510 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined
2025-09-23 19:12:43.083543 | orchestrator | failed: [testbed-manager] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"}
2025-09-23 19:12:43.083557 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined
2025-09-23 19:12:43.083581 | orchestrator | failed: [testbed-node-0] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"}
2025-09-23 19:12:43.083602 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined
2025-09-23 19:12:43.083620 | orchestrator | failed: [testbed-node-1] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"}
2025-09-23 19:12:43.083645 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined
2025-09-23 19:12:43.083667 | orchestrator | failed: [testbed-node-2] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"}
2025-09-23 19:12:43.083687 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined
2025-09-23 19:12:43.083708 | orchestrator | failed: [testbed-node-3] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"}
2025-09-23 19:12:43.083726 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083738 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083750 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.083774 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv.
The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-23 19:12:43.083799 | orchestrator | failed: [testbed-node-4] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": 
"registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-23 19:12:43.083819 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. 
The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-23 19:12:43.083839 | orchestrator | failed: [testbed-node-5] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": 
"registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-23 19:12:43.083862 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.083876 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.083895 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.083909 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.083928 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': 
['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.083942 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.083989 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.084004 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:12:43.084017 | orchestrator | changed: [testbed-node-2] => 
(item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.084030 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.084043 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:12:43.084055 | orchestrator |
2025-09-23 19:12:43.084068 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:12:43.084086 | orchestrator | testbed-manager : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2025-09-23 19:12:43.084107 | orchestrator | testbed-node-0 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2025-09-23 19:12:43.084120 | orchestrator | testbed-node-1 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2025-09-23 19:12:43.084133 | orchestrator | testbed-node-2 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2025-09-23 19:12:43.084145 | orchestrator | testbed-node-3 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2025-09-23 19:12:43.084157 | orchestrator | testbed-node-4 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2025-09-23 19:12:43.084171 | orchestrator | testbed-node-5 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0
2025-09-23 19:12:43.084183 | orchestrator |
2025-09-23 19:12:43.084195 | orchestrator |
2025-09-23 19:12:43.084206 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:12:43.084216 | orchestrator | Tuesday 23 September 2025 19:12:39 +0000 (0:00:05.029) 0:00:25.643 *****
2025-09-23 19:12:43.084227 | orchestrator | ===============================================================================
2025-09-23 19:12:43.084238 | orchestrator | service-cert-copy : common | Copying over extra CA certificates --------- 5.38s
2025-09-23 19:12:43.084249 | orchestrator | common : Copying over config.json files for services -------------------- 5.03s
2025-09-23 19:12:43.084260 | orchestrator | common : Ensuring config directories exist ------------------------------ 3.98s
2025-09-23 19:12:43.084271 | orchestrator | service-cert-copy : common | Copying over backend internal TLS key ------ 3.83s
2025-09-23 19:12:43.084282 | orchestrator | common : Restart systemd-tmpfiles --------------------------------------- 1.98s
2025-09-23 19:12:43.084292 | orchestrator | common : Copying over /run subdirectories conf -------------------------- 1.46s
2025-09-23 19:12:43.084303 | orchestrator | common : include_tasks -------------------------------------------------- 1.33s
2025-09-23 19:12:43.084314 | orchestrator | service-cert-copy : common | Copying over backend internal TLS certificate --- 1.29s
2025-09-23 19:12:43.084325 | orchestrator | common : include_tasks -------------------------------------------------- 1.14s
2025-09-23
19:12:43.084340 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED 2025-09-23 19:12:43.086856 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:12:43.087675 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state STARTED 2025-09-23 19:12:43.090276 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:12:43.090984 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:12:43.094781 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:12:43.095384 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED 2025-09-23 19:12:43.096412 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:43.099009 | orchestrator | 2025-09-23 19:12:43 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:43.099036 | orchestrator | 2025-09-23 19:12:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:12:46.145782 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED 2025-09-23 19:12:46.146604 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED 2025-09-23 19:12:46.147126 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:12:46.148055 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state STARTED 2025-09-23 19:12:46.149366 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state 
STARTED 2025-09-23 19:12:46.150420 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:12:46.150842 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:12:46.151988 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED 2025-09-23 19:12:46.153718 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:46.154638 | orchestrator | 2025-09-23 19:12:46 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:46.154672 | orchestrator | 2025-09-23 19:12:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:12:49.225011 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED 2025-09-23 19:12:49.225295 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED 2025-09-23 19:12:49.225323 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:12:49.227528 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state STARTED 2025-09-23 19:12:49.227552 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:12:49.227563 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:12:49.227574 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:12:49.230940 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED 2025-09-23 19:12:49.231061 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task 
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:49.232533 | orchestrator | 2025-09-23 19:12:49 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:49.232562 | orchestrator | 2025-09-23 19:12:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:12:52.264607 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED 2025-09-23 19:12:52.264838 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED 2025-09-23 19:12:52.266197 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:12:52.267331 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state STARTED 2025-09-23 19:12:52.268943 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:12:52.269622 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:12:52.270884 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:12:52.272588 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED 2025-09-23 19:12:52.273470 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:52.276195 | orchestrator | 2025-09-23 19:12:52 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:52.276222 | orchestrator | 2025-09-23 19:12:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:12:55.377103 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED 2025-09-23 19:12:55.377187 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task 
e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED 2025-09-23 19:12:55.377202 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:12:55.377215 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state STARTED 2025-09-23 19:12:55.377228 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:12:55.377239 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:12:55.377251 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:12:55.377263 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED 2025-09-23 19:12:55.377274 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:55.377285 | orchestrator | 2025-09-23 19:12:55 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:55.377297 | orchestrator | 2025-09-23 19:12:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:12:58.526767 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED 2025-09-23 19:12:58.526853 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state STARTED 2025-09-23 19:12:58.526867 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:12:58.526878 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state STARTED 2025-09-23 19:12:58.526889 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:12:58.526900 | orchestrator | 2025-09-23 
19:12:58 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:12:58.526910 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:12:58.526921 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED 2025-09-23 19:12:58.526932 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:12:58.526982 | orchestrator | 2025-09-23 19:12:58 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:12:58.526994 | orchestrator | 2025-09-23 19:12:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:13:01.478301 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED 2025-09-23 19:13:01.482714 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task e928d8c9-1bc0-440a-b4fe-d35aa949cd99 is in state SUCCESS 2025-09-23 19:13:01.483700 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED 2025-09-23 19:13:01.486895 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state STARTED 2025-09-23 19:13:01.489590 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:13:01.492127 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED 2025-09-23 19:13:01.496028 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED 2025-09-23 19:13:01.505002 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED 2025-09-23 19:13:01.511271 | orchestrator | 2025-09-23 19:13:01 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:13:01.513481 | 
2025-09-23 19:13:01 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:01 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:04 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED
2025-09-23 19:13:04 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state STARTED
2025-09-23 19:13:04 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state STARTED
2025-09-23 19:13:04 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:04 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state STARTED
2025-09-23 19:13:04 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED
2025-09-23 19:13:04 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:04 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:04 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:04 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:08 | INFO  | Task b55bfa23-eac7-4809-877c-0becf390ef9b is in state SUCCESS
PLAY [Apply role homer] ********************************************************

TASK [osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards] ***
Tuesday 23 September 2025 19:12:21 +0000 (0:00:00.690)       0:00:00.690 *****
ok: [testbed-manager] => {
    "msg": "The support for the homer_url_kibana has been removed. Please use the homer_url_opensearch_dashboards parameter."
}

TASK [osism.services.homer : Create traefik external network] ******************
Tuesday 23 September 2025 19:12:21 +0000 (0:00:00.462)       0:00:01.153 *****
ok: [testbed-manager]

TASK [osism.services.homer : Create required directories] **********************
Tuesday 23 September 2025 19:12:22 +0000 (0:00:01.455)       0:00:02.608 *****
changed: [testbed-manager] => (item=/opt/homer/configuration)
ok: [testbed-manager] => (item=/opt/homer)

TASK [osism.services.homer : Copy config.yml configuration file] ***************
Tuesday 23 September 2025 19:12:23 +0000 (0:00:01.029)       0:00:03.638 *****
changed: [testbed-manager]

TASK [osism.services.homer : Copy docker-compose.yml file] *********************
Tuesday 23 September 2025 19:12:27 +0000 (0:00:03.464)       0:00:07.103 *****
changed: [testbed-manager]

TASK [osism.services.homer : Manage homer service] *****************************
Tuesday 23 September 2025 19:12:29 +0000 (0:00:02.359)       0:00:09.462 *****
FAILED - RETRYING: [testbed-manager]: Manage homer service (10 retries left).
ok: [testbed-manager]

RUNNING HANDLER [osism.services.homer : Restart homer service] *****************
Tuesday 23 September 2025 19:12:56 +0000 (0:00:27.164)       0:00:36.627 *****
changed: [testbed-manager]

PLAY RECAP *********************************************************************
testbed-manager            : ok=7    changed=4    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

TASKS RECAP ********************************************************************
Tuesday 23 September 2025 19:13:01 +0000 (0:00:04.180)       0:00:40.808 *****
===============================================================================
osism.services.homer : Manage homer service ---------------------------- 27.16s
osism.services.homer : Restart homer service ---------------------------- 4.18s
osism.services.homer : Copy config.yml configuration file --------------- 3.47s
osism.services.homer : Copy docker-compose.yml file --------------------- 2.36s
osism.services.homer : Create traefik external network ------------------ 1.46s
osism.services.homer : Create required directories ---------------------- 1.03s
osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards --- 0.46s

PLAY [Apply role openstackclient] **********************************************

TASK [osism.services.openstackclient : Include tasks] **************************
Tuesday 23 September 2025 19:12:20 +0000 (0:00:00.248)       0:00:00.248 *****
included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/openstackclient/tasks/container-Debian-family.yml for testbed-manager

TASK [osism.services.openstackclient : Create required directories] ************
Tuesday 23 September 2025 19:12:20 +0000 (0:00:00.532)       0:00:00.781 *****
changed: [testbed-manager] => (item=/opt/configuration/environments/openstack)
changed: [testbed-manager] => (item=/opt/openstackclient/data)
ok: [testbed-manager] => (item=/opt/openstackclient)

TASK [osism.services.openstackclient : Copy docker-compose.yml file] ***********
Tuesday 23 September 2025 19:12:22 +0000 (0:00:01.799)       0:00:02.580 *****
changed: [testbed-manager]

TASK [osism.services.openstackclient : Manage openstackclient service] *********
Tuesday 23 September 2025 19:12:25 +0000 (0:00:02.447)       0:00:05.028 *****
FAILED - RETRYING: [testbed-manager]: Manage openstackclient service (10 retries left).
ok: [testbed-manager]

TASK [osism.services.openstackclient : Copy openstack wrapper script] **********
Tuesday 23 September 2025 19:12:59 +0000 (0:00:34.135)       0:00:39.163 *****
changed: [testbed-manager]

TASK [osism.services.openstackclient : Remove ospurge wrapper script] **********
Tuesday 23 September 2025 19:13:00 +0000 (0:00:01.115)       0:00:40.279 *****
ok: [testbed-manager]

RUNNING HANDLER [osism.services.openstackclient : Restart openstackclient service] ***
Tuesday 23 September 2025 19:13:01 +0000 (0:00:00.578)       0:00:40.858 *****
changed: [testbed-manager]

RUNNING HANDLER [osism.services.openstackclient : Ensure that all containers are up] ***
Tuesday 23 September 2025 19:13:03 +0000 (0:00:02.517)       0:00:43.375 *****
changed: [testbed-manager]

RUNNING HANDLER [osism.services.openstackclient : Wait for an healthy service] ***
Tuesday 23 September 2025 19:13:04 +0000 (0:00:01.133)       0:00:44.509 *****
changed: [testbed-manager]

RUNNING HANDLER [osism.services.openstackclient : Copy bash completion script] ***
Tuesday 23 September 2025 19:13:05 +0000 (0:00:01.189)       0:00:45.698 *****
ok: [testbed-manager]

PLAY RECAP *********************************************************************
testbed-manager            : ok=10   changed=6    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

TASKS RECAP ********************************************************************
Tuesday 23 September 2025 19:13:06 +0000 (0:00:00.397)       0:00:46.095 *****
===============================================================================
osism.services.openstackclient : Manage openstackclient service -------- 34.14s
osism.services.openstackclient : Restart openstackclient service -------- 2.52s
osism.services.openstackclient : Copy docker-compose.yml file ----------- 2.45s
osism.services.openstackclient : Create required directories ------------ 1.80s
osism.services.openstackclient : Wait for an healthy service ------------ 1.19s
osism.services.openstackclient : Ensure that all containers are up ------ 1.13s
osism.services.openstackclient : Copy openstack wrapper script ---------- 1.12s
osism.services.openstackclient : Remove ospurge wrapper script ---------- 0.58s
osism.services.openstackclient : Include tasks -------------------------- 0.53s
osism.services.openstackclient : Copy bash completion script ------------ 0.40s

PLAY [Group hosts based on configuration] **************************************

TASK [Group hosts based on enabled services] ***********************************
Tuesday 23 September 2025 19:12:22 +0000 (0:00:00.586)       0:00:00.586 *****
changed: [testbed-manager] => (item=enable_netdata_True)
changed: [testbed-node-0] => (item=enable_netdata_True)
changed: [testbed-node-1] => (item=enable_netdata_True)
changed: [testbed-node-2] => (item=enable_netdata_True)
changed: [testbed-node-3] => (item=enable_netdata_True)
changed: [testbed-node-4] => (item=enable_netdata_True)
changed: [testbed-node-5] => (item=enable_netdata_True)

PLAY [Apply role netdata] ******************************************************

TASK [osism.services.netdata : Include distribution specific install tasks] ****
Tuesday 23 September 2025 19:12:23 +0000 (0:00:01.011)       0:00:01.597 *****
included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5

TASK [osism.services.netdata : Remove old architecture-dependent repository] ***
Tuesday 23 September 2025 19:12:24 +0000 (0:00:01.431)       0:00:03.029 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-manager]

TASK [osism.services.netdata : Install apt-transport-https package] ************
Tuesday 23 September 2025 19:12:27 +0000 (0:00:02.879)       0:00:05.909 *****
ok: [testbed-manager]
ok: [testbed-node-5]
ok: [testbed-node-1]
ok: [testbed-node-3]
ok: [testbed-node-0]
ok: [testbed-node-2]
ok: [testbed-node-4]

TASK [osism.services.netdata : Add repository gpg key] *************************
Tuesday 23 September 2025 19:12:30 +0000 (0:00:03.014)       0:00:08.923 *****
changed: [testbed-node-1]
changed: [testbed-node-0]
changed: [testbed-node-2]
changed: [testbed-node-4]
changed: [testbed-node-3]
changed: [testbed-node-5]
changed: [testbed-manager]

TASK [osism.services.netdata : Add repository] *********************************
Tuesday 23 September 2025 19:12:33 +0000 (0:00:02.166)       0:00:11.089 *****
changed: [testbed-node-1]
changed: [testbed-node-2]
changed: [testbed-node-5]
changed: [testbed-node-3]
changed: [testbed-node-0]
changed: [testbed-node-4]
changed: [testbed-manager]

TASK [osism.services.netdata : Install package netdata] ************************
Tuesday 23 September 2025 19:12:42 +0000 (0:00:09.693)       0:00:20.783 *****
changed: [testbed-node-2]
changed: [testbed-node-1]
changed: [testbed-node-0]
changed: [testbed-node-5]
changed: [testbed-node-4]
changed: [testbed-node-3]
changed: [testbed-manager]

TASK [osism.services.netdata : Include config tasks] ***************************
Tuesday 23 September 2025 19:13:05 +0000 (0:00:22.865)       0:00:43.648 *****
included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-2, testbed-node-1, testbed-node-3, testbed-node-4, testbed-node-5

TASK [osism.services.netdata : Copy configuration files] ***********************
Tuesday 23 September 2025 19:13:07 +0000 (0:00:01.581)       0:00:45.229 *****
changed: [testbed-manager] => (item=netdata.conf)
changed: [testbed-node-1] => (item=netdata.conf)
changed: [testbed-node-0] => (item=netdata.conf)
changed: [testbed-node-2] => (item=netdata.conf)
changed: [testbed-node-3] => (item=netdata.conf)
changed: [testbed-node-4] => (item=netdata.conf)
changed: [testbed-node-0] => (item=stream.conf)
changed: [testbed-node-5] => (item=netdata.conf)
changed: [testbed-manager] => (item=stream.conf)
changed: [testbed-node-2] => (item=stream.conf)
changed: [testbed-node-3] => (item=stream.conf)
changed: [testbed-node-1] => (item=stream.conf)
changed: [testbed-node-4] => (item=stream.conf)
changed: [testbed-node-5] => (item=stream.conf)

TASK [osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status] ***
Tuesday 23 September 2025 19:13:12 +0000 (0:00:01.043)       0:00:50.481 *****
ok: [testbed-manager]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]

TASK [osism.services.netdata : Opt out from anonymous statistics] **************
Tuesday 23 September 2025 19:13:13 +0000 (0:00:01.348)       0:00:51.524 *****
changed: [testbed-manager]
changed: [testbed-node-1]
changed: [testbed-node-2]
changed: [testbed-node-0]
changed: [testbed-node-3]
changed: [testbed-node-4]
changed: [testbed-node-5]

TASK [osism.services.netdata : Add netdata user to docker group] ***************
Tuesday 23 September 2025 19:13:14 +0000 (0:00:01.316)       0:00:52.873 *****
ok: [testbed-manager]
ok: [testbed-node-2]
ok: [testbed-node-3]
ok: [testbed-node-1]
ok: [testbed-node-0]
ok: [testbed-node-4]
ok: [testbed-node-5]

TASK [osism.services.netdata : Manage service netdata] *************************
Tuesday 23 September 2025 19:13:16 +0000 (0:00:02.178)       0:00:54.189 *****
ok: [testbed-manager]
ok: [testbed-node-1]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-2]

TASK [osism.services.netdata : Include host type specific tasks] ***************
Tuesday 23 September 2025 19:13:18 +0000 (0:00:01.290)       0:00:56.367 *****
included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/server.yml for testbed-manager
included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/client.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5

TASK [osism.services.netdata : Set sysctl vm.max_map_count parameter] **********
Tuesday 23 September 2025 19:13:19 +0000 (0:00:01.652)       0:00:57.658 *****
changed: [testbed-manager]

RUNNING HANDLER [osism.services.netdata : Restart service netdata] *************
Tuesday 23 September 2025 19:13:21 +0000 (0:00:03.647)       0:00:59.311 *****
changed: [testbed-manager]
changed: [testbed-node-1]
changed: [testbed-node-0]
changed: [testbed-node-4]
changed: [testbed-node-2]
changed: [testbed-node-3]
changed: [testbed-node-5]

PLAY RECAP *********************************************************************
testbed-manager            : ok=16   changed=8    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-0             : ok=15   changed=7    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-1             : ok=15   changed=7    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-2             : ok=15   changed=7    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-3             : ok=15   changed=7    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-4             : ok=15   changed=7    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
testbed-node-5             : ok=15   changed=7    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

TASKS RECAP ********************************************************************
Tuesday 23 September 2025 19:13:24 +0000 (0:00:03.647)       0:01:02.958 *****
===============================================================================
osism.services.netdata : Install package netdata ----------------------- 22.87s
osism.services.netdata : Add repository --------------------------------- 9.69s
osism.services.netdata : Copy configuration files ----------------------- 5.25s
osism.services.netdata : Restart service netdata ------------------------ 3.65s
osism.services.netdata : Install apt-transport-https package ------------ 3.01s
osism.services.netdata : Remove old architecture-dependent repository --- 2.88s
osism.services.netdata : Manage service netdata ------------------------- 2.18s
osism.services.netdata : Add repository gpg key ------------------------- 2.17s
osism.services.netdata : Set sysctl vm.max_map_count parameter ---------- 1.65s
osism.services.netdata : Include config tasks --------------------------- 1.58s
osism.services.netdata : Include distribution specific install tasks ---- 1.43s
osism.services.netdata : Opt out from anonymous statistics -------------- 1.35s
osism.services.netdata : Add netdata user to docker group --------------- 1.32s
osism.services.netdata : Include host type specific tasks --------------- 1.29s
osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status --- 1.04s
Group hosts based on enabled services ----------------------------------- 1.01s

2025-09-23 19:13:26 | INFO  | Task 7d413a47-55a1-4ed4-b291-bdfcf9d9c21e is in state SUCCESS
2025-09-23 19:13:26 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state STARTED
2025-09-23 19:13:26 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:26 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:26 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:26 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:29 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state STARTED
2025-09-23 19:13:29 | INFO  | Task 8e739d35-50e2-40cb-bb51-6aca32c93dc3 is in state SUCCESS
2025-09-23 19:13:29 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:29.630118 | orchestrator | 2025-09-23 19:13:29 | INFO  | Task 71c1ed46-405c-4efa-bed9-d9ead4dbeabc is in state SUCCESS
2025-09-23 19:13:29.630628 | orchestrator | 2025-09-23 19:13:29 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:29.631254 | orchestrator | 2025-09-23 19:13:29 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:29.635027 | orchestrator | 2025-09-23 19:13:29 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:29.635482 | orchestrator | 2025-09-23 19:13:29 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:29.635504 | orchestrator | 2025-09-23 19:13:29 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:32.820569 | orchestrator |
2025-09-23 19:13:32.820647 | orchestrator |
2025-09-23 19:13:32.820661 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:13:32.820673 | orchestrator |
2025-09-23 19:13:32.820683 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:13:32.820694 | orchestrator | Tuesday 23 September 2025 19:12:46 +0000 (0:00:00.525) 0:00:00.525 *****
2025-09-23 19:13:32.820704 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:13:32.820716 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:13:32.820726 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:13:32.820736 | orchestrator |
2025-09-23 19:13:32.820747 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:13:32.820757 | orchestrator | Tuesday 23 September 2025 19:12:48 +0000 (0:00:01.689) 0:00:02.215 *****
2025-09-23 19:13:32.820768 | orchestrator | ok: [testbed-node-0] => (item=enable_memcached_True)
2025-09-23 19:13:32.820779 | orchestrator | ok: [testbed-node-1] => (item=enable_memcached_True)
2025-09-23 19:13:32.820789 | orchestrator | ok: [testbed-node-2] => (item=enable_memcached_True)
2025-09-23 19:13:32.820800 | orchestrator |
2025-09-23 19:13:32.820810 | orchestrator | PLAY [Apply role memcached] ****************************************************
2025-09-23 19:13:32.820821 | orchestrator |
2025-09-23 19:13:32.820831 | orchestrator | TASK [memcached : include_tasks] ***********************************************
2025-09-23 19:13:32.820842 | orchestrator | Tuesday 23 September 2025 19:12:49 +0000 (0:00:01.464) 0:00:03.679 *****
2025-09-23 19:13:32.820875 | orchestrator | included: /ansible/roles/memcached/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:13:32.820886 | orchestrator |
2025-09-23 19:13:32.820897 | orchestrator | TASK [memcached : Ensuring config directories exist] ***************************
2025-09-23 19:13:32.820908 | orchestrator | Tuesday 23 September 2025 19:12:52 +0000 (0:00:02.942) 0:00:06.622 *****
2025-09-23 19:13:32.820918 | orchestrator | changed: [testbed-node-1] => (item=memcached)
2025-09-23 19:13:32.820929 | orchestrator | changed: [testbed-node-0] => (item=memcached)
2025-09-23 19:13:32.820939 | orchestrator | changed: [testbed-node-2] => (item=memcached)
2025-09-23 19:13:32.820971 | orchestrator |
2025-09-23 19:13:32.820996 | orchestrator | TASK [memcached : Copying over config.json files for services] *****************
2025-09-23 19:13:32.821014 | orchestrator | Tuesday 23 September 2025 19:12:54 +0000 (0:00:02.087) 0:00:08.710 *****
2025-09-23 19:13:32.821030 | orchestrator | changed: [testbed-node-0] => (item=memcached)
2025-09-23 19:13:32.821046 | orchestrator | changed: [testbed-node-1] => (item=memcached)
2025-09-23 19:13:32.821062 | orchestrator | changed: [testbed-node-2] => (item=memcached)
2025-09-23 19:13:32.821077 | orchestrator |
2025-09-23 19:13:32.821096 | orchestrator | TASK [memcached : Check memcached container] ***********************************
2025-09-23 19:13:32.821113 | orchestrator | Tuesday 23 September 2025 19:12:58 +0000 (0:00:03.892) 0:00:12.602 *****
2025-09-23 19:13:32.821132 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:13:32.821144 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:13:32.821155 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:13:32.821165 | orchestrator |
2025-09-23 19:13:32.821176 | orchestrator | RUNNING HANDLER [memcached : Restart memcached container] **********************
2025-09-23 19:13:32.821187 | orchestrator | Tuesday 23 September 2025 19:13:04 +0000 (0:00:05.336) 0:00:17.939 *****
2025-09-23 19:13:32.821197 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:13:32.821207 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:13:32.821218 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:13:32.821228 | orchestrator |
2025-09-23 19:13:32.821240 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:13:32.821251 | orchestrator | testbed-node-0 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:13:32.821262 | orchestrator | testbed-node-1 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:13:32.821273 | orchestrator | testbed-node-2 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:13:32.821284 | orchestrator |
2025-09-23 19:13:32.821295 | orchestrator |
2025-09-23 19:13:32.821306 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:13:32.821317 | orchestrator | Tuesday 23 September 2025 19:13:26 +0000 (0:00:22.872) 0:00:40.811 *****
2025-09-23 19:13:32.821328 | orchestrator | ===============================================================================
2025-09-23 19:13:32.821338 | orchestrator | memcached : Restart memcached container -------------------------------- 22.87s
2025-09-23 19:13:32.821349 | orchestrator | memcached : Check memcached container ----------------------------------- 5.34s
2025-09-23 19:13:32.821360 | orchestrator | memcached : Copying over config.json files for services ----------------- 3.89s
2025-09-23 19:13:32.821370 | orchestrator | memcached : include_tasks ----------------------------------------------- 2.94s
2025-09-23 19:13:32.821381 | orchestrator | memcached : Ensuring config directories exist --------------------------- 2.09s
2025-09-23 19:13:32.821393 | orchestrator | Group hosts based on Kolla action --------------------------------------- 1.70s
2025-09-23 19:13:32.821403 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.45s
2025-09-23 19:13:32.821414 | orchestrator |
2025-09-23 19:13:32.821424 | orchestrator |
2025-09-23 19:13:32.821435 | orchestrator | PLAY [Apply role phpmyadmin] ***************************************************
2025-09-23 19:13:32.821445 | orchestrator |
2025-09-23 19:13:32.821456 | orchestrator | TASK [osism.services.phpmyadmin : Create traefik external network] *************
2025-09-23 19:13:32.821467 | orchestrator | Tuesday 23 September 2025 19:12:37 +0000 (0:00:00.221) 0:00:00.221 *****
2025-09-23 19:13:32.821478 | orchestrator | ok: [testbed-manager]
2025-09-23 19:13:32.821487 | orchestrator |
2025-09-23 19:13:32.821496 | orchestrator | TASK [osism.services.phpmyadmin : Create required directories] *****************
2025-09-23 19:13:32.821506 | orchestrator | Tuesday 23 September 2025 19:12:38 +0000 (0:00:01.186) 0:00:01.407 *****
2025-09-23 19:13:32.821531 | orchestrator | changed: [testbed-manager] => (item=/opt/phpmyadmin)
2025-09-23 19:13:32.821550 | orchestrator |
2025-09-23 19:13:32.821560 | orchestrator | TASK [osism.services.phpmyadmin : Copy docker-compose.yml file] ****************
2025-09-23 19:13:32.821570 | orchestrator | Tuesday 23 September 2025 19:12:39 +0000 (0:00:00.960) 0:00:01.888 *****
2025-09-23 19:13:32.821579 | orchestrator | changed: [testbed-manager]
2025-09-23 19:13:32.821588 | orchestrator |
2025-09-23 19:13:32.821598 | orchestrator | TASK [osism.services.phpmyadmin : Manage phpmyadmin service] *******************
2025-09-23 19:13:32.821607 | orchestrator | Tuesday 23 September 2025 19:12:40 +0000 (0:00:00.960) 0:00:02.848 *****
2025-09-23 19:13:32.821617 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage phpmyadmin service (10 retries left).
2025-09-23 19:13:32.821626 | orchestrator | ok: [testbed-manager]
2025-09-23 19:13:32.821635 | orchestrator |
2025-09-23 19:13:32.821645 | orchestrator | RUNNING HANDLER [osism.services.phpmyadmin : Restart phpmyadmin service] *******
2025-09-23 19:13:32.821654 | orchestrator | Tuesday 23 September 2025 19:13:25 +0000 (0:00:44.821) 0:00:47.669 *****
2025-09-23 19:13:32.821663 | orchestrator | changed: [testbed-manager]
2025-09-23 19:13:32.821673 | orchestrator |
2025-09-23 19:13:32.821682 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:13:32.821692 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:13:32.821701 | orchestrator |
2025-09-23 19:13:32.821711 | orchestrator |
2025-09-23 19:13:32.821720 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:13:32.821730 | orchestrator | Tuesday 23 September 2025 19:13:28 +0000 (0:00:03.513) 0:00:51.183 *****
2025-09-23 19:13:32.821739 | orchestrator | ===============================================================================
2025-09-23 19:13:32.821748 | orchestrator | osism.services.phpmyadmin : Manage phpmyadmin service ------------------ 44.82s
2025-09-23 19:13:32.821758 | orchestrator | osism.services.phpmyadmin : Restart phpmyadmin service ------------------ 3.51s
2025-09-23 19:13:32.821767 | orchestrator | osism.services.phpmyadmin : Create traefik external network ------------- 1.19s
2025-09-23 19:13:32.821781 | orchestrator | osism.services.phpmyadmin : Copy docker-compose.yml file ---------------- 0.96s
2025-09-23 19:13:32.821791 | orchestrator | osism.services.phpmyadmin : Create required directories ----------------- 0.48s
2025-09-23 19:13:32.821801 | orchestrator |
2025-09-23 19:13:32.821810 | orchestrator |
2025-09-23 19:13:32.821820 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:13:32.821829 | orchestrator |
2025-09-23 19:13:32.821839 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:13:32.821848 | orchestrator | Tuesday 23 September 2025 19:12:49 +0000 (0:00:01.099) 0:00:01.099 *****
2025-09-23 19:13:32.821891 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:13:32.821900 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:13:32.821910 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:13:32.821920 | orchestrator |
2025-09-23 19:13:32.821929 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:13:32.821939 | orchestrator | Tuesday 23 September 2025 19:12:50 +0000 (0:00:01.042) 0:00:02.142 *****
2025-09-23 19:13:32.821948 | orchestrator | ok: [testbed-node-0] => (item=enable_redis_True)
2025-09-23 19:13:32.821958 | orchestrator | ok: [testbed-node-1] => (item=enable_redis_True)
2025-09-23 19:13:32.821967 | orchestrator | ok: [testbed-node-2] => (item=enable_redis_True)
2025-09-23 19:13:32.821976 | orchestrator |
2025-09-23 19:13:32.821986 | orchestrator | PLAY [Apply role redis] ********************************************************
2025-09-23 19:13:32.821995 | orchestrator |
2025-09-23 19:13:32.822004 | orchestrator | TASK [redis : include_tasks] ***************************************************
2025-09-23 19:13:32.822079 | orchestrator | Tuesday 23 September 2025 19:12:52 +0000 (0:00:01.759) 0:00:03.902 *****
2025-09-23 19:13:32.822092 | orchestrator | included: /ansible/roles/redis/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:13:32.822102 | orchestrator |
2025-09-23 19:13:32.822111 | orchestrator | TASK [redis : Ensuring config directories exist] *******************************
2025-09-23 19:13:32.822128 | orchestrator | Tuesday 23 September 2025 19:12:54 +0000 (0:00:01.761) 0:00:05.663 *****
2025-09-23 19:13:32.822140 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822155 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822174 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822186 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822201 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822212 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822222 | orchestrator |
2025-09-23 19:13:32.822232 | orchestrator | TASK [redis : Copying over default config.json files] **************************
2025-09-23 19:13:32.822247 | orchestrator | Tuesday 23 September 2025 19:12:57 +0000 (0:00:03.104) 0:00:08.767 *****
2025-09-23 19:13:32.822258 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822268 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822284 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822295 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822305 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822315 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822330 | orchestrator |
2025-09-23 19:13:32.822340 | orchestrator | TASK [redis : Copying over redis config files] *********************************
2025-09-23 19:13:32.822350 | orchestrator | Tuesday 23 September 2025 19:13:04 +0000 (0:00:06.554) 0:00:15.321 *****
2025-09-23 19:13:32.822360 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822370 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822385 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822402 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822412 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822426 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822441 | orchestrator |
2025-09-23 19:13:32.822451 | orchestrator | TASK [redis : Check redis containers] ******************************************
2025-09-23 19:13:32.822461 | orchestrator | Tuesday 23 September 2025 19:13:10 +0000 (0:00:06.171) 0:00:21.493 *****
2025-09-23 19:13:32.822471 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822481 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822491 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822506 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822517 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822530 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}})
2025-09-23 19:13:32.822545 | orchestrator |
2025-09-23 19:13:32.822555 | orchestrator | TASK [redis : Flush handlers] **************************************************
2025-09-23 19:13:32.822565 | orchestrator | Tuesday 23 September 2025 19:13:12 +0000 (0:00:02.516) 0:00:24.009 *****
2025-09-23 19:13:32.822575 | orchestrator |
2025-09-23 19:13:32.822584 | orchestrator | TASK [redis : Flush handlers] **************************************************
2025-09-23 19:13:32.822594 | orchestrator | Tuesday 23 September 2025 19:13:12 +0000 (0:00:00.094) 0:00:24.104 *****
2025-09-23 19:13:32.822603 | orchestrator |
2025-09-23 19:13:32.822613 | orchestrator | TASK [redis : Flush handlers] **************************************************
2025-09-23 19:13:32.822622 | orchestrator | Tuesday 23 September 2025 19:13:13 +0000 (0:00:00.104) 0:00:24.208 *****
2025-09-23 19:13:32.822632 | orchestrator |
2025-09-23 19:13:32.822641 | orchestrator | RUNNING HANDLER [redis : Restart redis container] ******************************
2025-09-23 19:13:32.822651 | orchestrator | Tuesday 23 September 2025 19:13:13 +0000 (0:00:00.057) 0:00:24.266 *****
2025-09-23 19:13:32.822660 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:13:32.822670 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:13:32.822679 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:13:32.822689 | orchestrator |
2025-09-23 19:13:32.822698 | orchestrator | RUNNING HANDLER [redis : Restart redis-sentinel container] *********************
2025-09-23 19:13:32.822708 | orchestrator | Tuesday 23 September 2025 19:13:27 +0000 (0:00:13.989) 0:00:38.255 *****
2025-09-23 19:13:32.822717 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:13:32.822727 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:13:32.822736 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:13:32.822745 | orchestrator |
2025-09-23 19:13:32.822755 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:13:32.822765 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:13:32.822775 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:13:32.822785 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:13:32.822794 | orchestrator |
2025-09-23 19:13:32.822804 | orchestrator |
2025-09-23 19:13:32.822814 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:13:32.822823 | orchestrator | Tuesday 23 September 2025 19:13:30 +0000 (0:00:03.147) 0:00:41.403 *****
2025-09-23 19:13:32.822833 | orchestrator | ===============================================================================
2025-09-23 19:13:32.822842 | orchestrator | redis : Restart redis container ---------------------------------------- 13.99s
2025-09-23 19:13:32.822895 | orchestrator | redis : Copying over default config.json files -------------------------- 6.55s
2025-09-23 19:13:32.822906 | orchestrator | redis : Copying over redis config files --------------------------------- 6.17s
2025-09-23 19:13:32.822916 | orchestrator | redis : Restart redis-sentinel container -------------------------------- 3.15s
2025-09-23 19:13:32.822925 | orchestrator | redis : Ensuring config directories exist ------------------------------- 3.10s
2025-09-23 19:13:32.822935 | orchestrator | redis : Check redis containers ------------------------------------------ 2.52s
2025-09-23 19:13:32.822950 | orchestrator | redis : include_tasks --------------------------------------------------- 1.76s
2025-09-23 19:13:32.822960 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.76s
2025-09-23 19:13:32.822969 | orchestrator | Group hosts based on Kolla action --------------------------------------- 1.04s
2025-09-23 19:13:32.822979 | orchestrator | redis : Flush handlers -------------------------------------------------- 0.26s
2025-09-23 19:13:32.822989 | orchestrator | 2025-09-23 19:13:32 | INFO  | Task ff5bfa95-bdd3-426d-a830-f1a37cd050f3 is in state SUCCESS
2025-09-23 19:13:32.822998 | orchestrator | 2025-09-23 19:13:32 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:32.823015 | orchestrator | 2025-09-23 19:13:32 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:32.823024 | orchestrator | 2025-09-23 19:13:32 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:32.823034 | orchestrator | 2025-09-23 19:13:32 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:32.823044 | orchestrator | 2025-09-23 19:13:32 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:32.823053 | orchestrator | 2025-09-23 19:13:32 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:35.737429 | orchestrator | 2025-09-23 19:13:35 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:35.737973 | orchestrator | 2025-09-23 19:13:35 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:35.739176 | orchestrator | 2025-09-23 19:13:35 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:35.739896 | orchestrator | 2025-09-23 19:13:35 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:35.740693 | orchestrator | 2025-09-23 19:13:35 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:35.740793 | orchestrator | 2025-09-23 19:13:35 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:38.772411 | orchestrator | 2025-09-23 19:13:38 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:38.772475 | orchestrator | 2025-09-23 19:13:38 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:38.773484 | orchestrator | 2025-09-23 19:13:38 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:38.773978 | orchestrator | 2025-09-23 19:13:38 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:38.775433 | orchestrator | 2025-09-23 19:13:38 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:38.775453 | orchestrator | 2025-09-23 19:13:38 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:41.819074 | orchestrator | 2025-09-23 19:13:41 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:41.819618 | orchestrator | 2025-09-23 19:13:41 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:41.820940 | orchestrator | 2025-09-23 19:13:41 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:41.822099 | orchestrator | 2025-09-23 19:13:41 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:41.823330 | orchestrator | 2025-09-23 19:13:41 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:41.823353 | orchestrator | 2025-09-23 19:13:41 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:44.849759 | orchestrator | 2025-09-23 19:13:44 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:44.850849 | orchestrator | 2025-09-23 19:13:44 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:44.851742 | orchestrator | 2025-09-23 19:13:44 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:44.852659 | orchestrator | 2025-09-23 19:13:44 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:44.853544 | orchestrator | 2025-09-23 19:13:44 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:44.853600 | orchestrator | 2025-09-23 19:13:44 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:47.942532 | orchestrator | 2025-09-23 19:13:47 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:47.943409 | orchestrator | 2025-09-23 19:13:47 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:47.945654 | orchestrator | 2025-09-23 19:13:47 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:47.948267 | orchestrator | 2025-09-23 19:13:47 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:47.949352 | orchestrator | 2025-09-23 19:13:47 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:47.950466 | orchestrator | 2025-09-23 19:13:47 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:50.996232 | orchestrator | 2025-09-23 19:13:50 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:50.997831 | orchestrator | 2025-09-23 19:13:50 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:50.999232 | orchestrator | 2025-09-23 19:13:50 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:50.999262 | orchestrator | 2025-09-23 19:13:50 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:51.000620 | orchestrator | 2025-09-23 19:13:51 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:51.000645 | orchestrator | 2025-09-23 19:13:51 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:54.037162 | orchestrator | 2025-09-23 19:13:54 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:54.037326 | orchestrator | 2025-09-23 19:13:54 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:54.038885 | orchestrator | 2025-09-23 19:13:54 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:54.039753 | orchestrator | 2025-09-23 19:13:54 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:54.040371 | orchestrator | 2025-09-23 19:13:54 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:54.040550 | orchestrator | 2025-09-23 19:13:54 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:13:57.073099 | orchestrator | 2025-09-23 19:13:57 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:13:57.073358 | orchestrator | 2025-09-23 19:13:57 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:13:57.074323 | orchestrator | 2025-09-23 19:13:57 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:13:57.075330 | orchestrator | 2025-09-23 19:13:57 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:13:57.076329 | orchestrator | 2025-09-23 19:13:57 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:13:57.076353 | orchestrator | 2025-09-23 19:13:57 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:00.120026 | orchestrator | 2025-09-23 19:14:00 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:00.120544 | orchestrator | 2025-09-23 19:14:00 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state STARTED
2025-09-23 19:14:00.123074 | orchestrator | 2025-09-23 19:14:00 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:00.124012 | orchestrator | 2025-09-23 19:14:00 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:00.125051 | orchestrator | 2025-09-23 19:14:00 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:00.125074 | orchestrator | 2025-09-23 19:14:00 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:03.169393 | orchestrator | 2025-09-23 19:14:03 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:03.172236 | orchestrator | 2025-09-23 19:14:03 | INFO  | Task 563ffab1-53c1-492a-8dd0-05d568e8c4db is in state SUCCESS
2025-09-23 19:14:03.174203 | orchestrator |
2025-09-23 19:14:03.174241 | orchestrator |
2025-09-23 19:14:03.174250 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:14:03.174260 | orchestrator |
2025-09-23 19:14:03.174269 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:14:03.174278 | orchestrator | Tuesday 23 September 2025 19:12:51 +0000
(0:00:00.526) 0:00:00.526 *****
2025-09-23 19:14:03.174288 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:14:03.174298 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:14:03.174307 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:14:03.174316 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:14:03.174324 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:14:03.174333 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:14:03.174341 | orchestrator |
2025-09-23 19:14:03.174350 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:14:03.174359 | orchestrator | Tuesday 23 September 2025 19:12:53 +0000 (0:00:02.132) 0:00:02.659 *****
2025-09-23 19:14:03.174367 | orchestrator | ok: [testbed-node-3] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2025-09-23 19:14:03.174376 | orchestrator | ok: [testbed-node-4] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2025-09-23 19:14:03.174385 | orchestrator | ok: [testbed-node-5] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2025-09-23 19:14:03.174393 | orchestrator | ok: [testbed-node-0] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2025-09-23 19:14:03.174401 | orchestrator | ok: [testbed-node-1] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2025-09-23 19:14:03.174410 | orchestrator | ok: [testbed-node-2] => (item=enable_openvswitch_True_enable_ovs_dpdk_False)
2025-09-23 19:14:03.174419 | orchestrator |
2025-09-23 19:14:03.174427 | orchestrator | PLAY [Apply role openvswitch] **************************************************
2025-09-23 19:14:03.174436 | orchestrator |
2025-09-23 19:14:03.174444 | orchestrator | TASK [openvswitch : include_tasks] *********************************************
2025-09-23 19:14:03.174452 | orchestrator | Tuesday 23 September 2025 19:12:55 +0000 (0:00:02.439) 0:00:05.099 *****
2025-09-23 19:14:03.174462 | orchestrator | included: /ansible/roles/openvswitch/tasks/deploy.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:14:03.174534 | orchestrator |
2025-09-23 19:14:03.174547 | orchestrator | TASK [module-load : Load modules] **********************************************
2025-09-23 19:14:03.174556 | orchestrator | Tuesday 23 September 2025 19:12:58 +0000 (0:00:03.361) 0:00:08.460 *****
2025-09-23 19:14:03.174565 | orchestrator | changed: [testbed-node-3] => (item=openvswitch)
2025-09-23 19:14:03.174574 | orchestrator | changed: [testbed-node-0] => (item=openvswitch)
2025-09-23 19:14:03.174583 | orchestrator | changed: [testbed-node-4] => (item=openvswitch)
2025-09-23 19:14:03.174591 | orchestrator | changed: [testbed-node-5] => (item=openvswitch)
2025-09-23 19:14:03.174600 | orchestrator | changed: [testbed-node-1] => (item=openvswitch)
2025-09-23 19:14:03.174622 | orchestrator | changed: [testbed-node-2] => (item=openvswitch)
2025-09-23 19:14:03.174631 | orchestrator |
2025-09-23 19:14:03.174640 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************
2025-09-23 19:14:03.174649 | orchestrator | Tuesday 23 September 2025 19:13:02 +0000 (0:00:03.632) 0:00:12.093 *****
2025-09-23 19:14:03.174676 | orchestrator | changed: [testbed-node-4] => (item=openvswitch)
2025-09-23 19:14:03.174685 | orchestrator | changed: [testbed-node-5] => (item=openvswitch)
2025-09-23 19:14:03.174694 | orchestrator | changed: [testbed-node-3] => (item=openvswitch)
2025-09-23 19:14:03.174702 | orchestrator | changed: [testbed-node-2] => (item=openvswitch)
2025-09-23 19:14:03.174711 | orchestrator | changed: [testbed-node-0] => (item=openvswitch)
2025-09-23 19:14:03.174719 | orchestrator | changed: [testbed-node-1] => (item=openvswitch)
2025-09-23 19:14:03.174728 | orchestrator |
2025-09-23 19:14:03.174736 | orchestrator | TASK [module-load : Drop module persistence] ***********************************
2025-09-23 19:14:03.174745 | orchestrator | Tuesday 23 September 2025 19:13:07 +0000 (0:00:04.631) 0:00:16.724 *****
2025-09-23 19:14:03.174753 | orchestrator | skipping: [testbed-node-5] => (item=openvswitch)
2025-09-23 19:14:03.174762 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:14:03.174771 | orchestrator | skipping: [testbed-node-0] => (item=openvswitch)
2025-09-23 19:14:03.174804 | orchestrator | skipping: [testbed-node-3] => (item=openvswitch)
2025-09-23 19:14:03.174812 | orchestrator | skipping: [testbed-node-2] => (item=openvswitch)
2025-09-23 19:14:03.174821 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:14:03.174830 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:14:03.174838 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:14:03.174847 | orchestrator | skipping: [testbed-node-4] => (item=openvswitch)
2025-09-23 19:14:03.174855 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:14:03.174864 | orchestrator | skipping: [testbed-node-1] => (item=openvswitch)
2025-09-23 19:14:03.174872 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:14:03.174881 | orchestrator |
2025-09-23 19:14:03.174890 | orchestrator | TASK [openvswitch : Create /run/openvswitch directory on host] *****************
2025-09-23 19:14:03.174898 | orchestrator | Tuesday 23 September 2025 19:13:09 +0000 (0:00:01.851) 0:00:18.575 *****
2025-09-23 19:14:03.174907 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:14:03.174915 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:14:03.174924 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:14:03.174932 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:14:03.174941 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:14:03.174949 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:14:03.174958 | orchestrator |
2025-09-23 19:14:03.174967 | orchestrator | TASK [openvswitch : Ensuring config directories exist] *************************
2025-09-23 19:14:03.174975 | orchestrator | Tuesday 23 September
2025 19:13:10 +0000 (0:00:01.248) 0:00:19.824 *****
2025-09-23 19:14:03.175001 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175014 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175030 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175044 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175054 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175063 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175078 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175088 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175107 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175116 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175125 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175140 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175149 | orchestrator |
2025-09-23 19:14:03.175158 | orchestrator | TASK [openvswitch : Copying over config.json files for services] ***************
2025-09-23 19:14:03.175167 | orchestrator | Tuesday 23 September 2025 19:13:12 +0000 (0:00:02.422) 0:00:22.247 *****
2025-09-23 19:14:03.175177 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175193 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175207 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175218 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175229 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175252 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175263 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175280 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175295 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175305 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175315 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175332 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}})
2025-09-23 19:14:03.175342 | orchestrator |
2025-09-23 19:14:03.175352 | orchestrator | TASK [openvswitch : Copying over ovs-vsctl wrapper] ****************************
2025-09-23 19:14:03.175370 | orchestrator | Tuesday 23 September 2025 19:13:16 +0000 (0:00:03.491) 0:00:25.738 *****
2025-09-23 19:14:03.175380 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:14:03.175389 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:14:03.175399 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:14:03.175409 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:14:03.175418 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:14:03.175429 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:14:03.175437 | orchestrator |
2025-09-23 19:14:03.175446 | orchestrator | TASK [openvswitch : Check openvswitch containers] ******************************
2025-09-23 19:14:03.175455 | orchestrator | Tuesday 23 September 2025 19:13:17 +0000 (0:00:01.284) 0:00:27.023 *****
2025-09-23 19:14:03.175464 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175473 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175488 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175498 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175512 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175528 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}})
2025-09-23 19:14:03.175538 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro',
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-23 19:14:03.175551 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-23 19:14:03.175560 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-23 19:14:03.175569 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 
'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-23 19:14:03.175596 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-23 19:14:03.175606 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': 
'30'}}})
2025-09-23 19:14:03.175615 | orchestrator |
2025-09-23 19:14:03.175623 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2025-09-23 19:14:03.175632 | orchestrator | Tuesday 23 September 2025 19:13:19 +0000 (0:00:02.268) 0:00:29.292 *****
2025-09-23 19:14:03.175641 | orchestrator |
2025-09-23 19:14:03.175650 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2025-09-23 19:14:03.175658 | orchestrator | Tuesday 23 September 2025 19:13:20 +0000 (0:00:00.276) 0:00:29.568 *****
2025-09-23 19:14:03.175667 | orchestrator |
2025-09-23 19:14:03.175675 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2025-09-23 19:14:03.175684 | orchestrator | Tuesday 23 September 2025 19:13:20 +0000 (0:00:00.135) 0:00:29.703 *****
2025-09-23 19:14:03.175693 | orchestrator |
2025-09-23 19:14:03.175701 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2025-09-23 19:14:03.175710 | orchestrator | Tuesday 23 September 2025 19:13:20 +0000 (0:00:00.238) 0:00:29.942 *****
2025-09-23 19:14:03.175718 | orchestrator |
2025-09-23 19:14:03.175727 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2025-09-23 19:14:03.175739 | orchestrator | Tuesday 23 September 2025 19:13:20 +0000 (0:00:00.291) 0:00:30.234 *****
2025-09-23 19:14:03.175748 | orchestrator |
2025-09-23 19:14:03.175757 | orchestrator | TASK [openvswitch : Flush Handlers] ********************************************
2025-09-23 19:14:03.175766 | orchestrator | Tuesday 23 September 2025 19:13:20 +0000 (0:00:00.216) 0:00:30.450 *****
2025-09-23 19:14:03.175790 | orchestrator |
2025-09-23 19:14:03.175799 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-db-server container] ********
2025-09-23 19:14:03.175807 | orchestrator | Tuesday 23 September 2025 19:13:21 +0000 (0:00:00.133) 0:00:30.583 *****
2025-09-23 19:14:03.175816 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:14:03.175825 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:14:03.175834 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:14:03.175842 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:14:03.175851 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:14:03.175860 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:14:03.175868 | orchestrator |
2025-09-23 19:14:03.175877 | orchestrator | RUNNING HANDLER [openvswitch : Waiting for openvswitch_db service to be ready] ***
2025-09-23 19:14:03.175886 | orchestrator | Tuesday 23 September 2025 19:13:45 +0000 (0:00:24.526) 0:00:55.110 *****
2025-09-23 19:14:03.175894 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:14:03.175903 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:14:03.175912 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:14:03.175920 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:14:03.175929 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:14:03.175944 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:14:03.175953 | orchestrator |
2025-09-23 19:14:03.175961 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-vswitchd container] *********
2025-09-23 19:14:03.175970 | orchestrator | Tuesday 23 September 2025 19:13:47 +0000 (0:00:01.849) 0:00:56.959 *****
2025-09-23 19:14:03.175979 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:14:03.175988 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:14:03.175996 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:14:03.176005 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:14:03.176013 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:14:03.176022 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:14:03.176030 | orchestrator |
2025-09-23 19:14:03.176039 | orchestrator | TASK [openvswitch : Set system-id, hostname and hw-offload] ********************
2025-09-23 19:14:03.176048 | orchestrator | Tuesday 23 September 2025 19:13:59 +0000 (0:00:11.576) 0:01:08.536 *****
2025-09-23 19:14:03.176058 | orchestrator | failed: [testbed-node-0] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-0'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-0"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176071 | orchestrator | failed: [testbed-node-5] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-5'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-5"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176171 | orchestrator | failed: [testbed-node-3] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-3'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-3"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176185 | orchestrator | failed: [testbed-node-4] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-4'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-4"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176195 | orchestrator | failed: [testbed-node-1] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-1'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-1"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176203 | orchestrator | failed: [testbed-node-2] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-2'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-2"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176212 | orchestrator | failed: [testbed-node-0] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-0'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-0"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176221 | orchestrator | failed: [testbed-node-3] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-3'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-3"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176235 | orchestrator | failed: [testbed-node-4] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-4'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-4"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176244 | orchestrator | failed: [testbed-node-5] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-5'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-5"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176260 | orchestrator | failed: [testbed-node-1] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-1'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-1"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176269 | orchestrator | failed: [testbed-node-2] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-2'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-2"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176281 | orchestrator | failed: [testbed-node-5] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176291 | orchestrator | failed: [testbed-node-4] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176300 | orchestrator | failed: [testbed-node-3] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176315 | orchestrator | failed: [testbed-node-1] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176325 | orchestrator | failed: [testbed-node-0] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176334 | orchestrator | failed: [testbed-node-2] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:03.176343 | orchestrator |
2025-09-23 19:14:03.176351 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:14:03.176360 | orchestrator | testbed-node-0 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0
2025-09-23 19:14:03.176370 | orchestrator | testbed-node-1 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0
2025-09-23 19:14:03.176379 | orchestrator | testbed-node-2 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0
2025-09-23 19:14:03.176387 | orchestrator | testbed-node-3 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0
2025-09-23 19:14:03.176396 | orchestrator | testbed-node-4 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0
2025-09-23 19:14:03.176404 | orchestrator | testbed-node-5 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0
2025-09-23 19:14:03.176419 | orchestrator |
2025-09-23 19:14:03.176427 | orchestrator |
2025-09-23 19:14:03.176436 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:14:03.176445 | orchestrator | Tuesday 23 September 2025 19:14:02 +0000 (0:00:03.305) 0:01:11.841 *****
2025-09-23 19:14:03.176458 | orchestrator | ===============================================================================
2025-09-23 19:14:03.176467 | orchestrator | openvswitch : Restart openvswitch-db-server container ------------------ 24.53s
2025-09-23 19:14:03.176476 | orchestrator | openvswitch : Restart openvswitch-vswitchd container ------------------- 11.58s
2025-09-23 19:14:03.176484 | orchestrator | module-load : Persist modules via modules-load.d ------------------------ 4.63s
2025-09-23 19:14:03.176493 | orchestrator | module-load : Load modules ---------------------------------------------- 3.63s
2025-09-23 19:14:03.176501 | orchestrator | openvswitch : Copying over config.json files for services --------------- 3.49s
2025-09-23 19:14:03.176510 | orchestrator | openvswitch : include_tasks --------------------------------------------- 3.36s
2025-09-23 19:14:03.176519 | orchestrator | openvswitch : Set system-id, hostname and hw-offload -------------------- 3.31s
2025-09-23 19:14:03.176527 | orchestrator | Group hosts based on enabled services ----------------------------------- 2.44s
2025-09-23 19:14:03.176536 | orchestrator | openvswitch : Ensuring config directories exist ------------------------- 2.42s
2025-09-23 19:14:03.176544 | orchestrator | openvswitch : Check openvswitch containers ------------------------------ 2.27s
2025-09-23 19:14:03.176553 | orchestrator | Group hosts based on Kolla action --------------------------------------- 2.13s
2025-09-23 19:14:03.176561 | orchestrator | module-load : Drop module persistence ----------------------------------- 1.85s
2025-09-23 19:14:03.176570 | orchestrator | openvswitch : Waiting for openvswitch_db service to be ready ------------ 1.85s
2025-09-23 19:14:03.176578 | orchestrator | openvswitch : Flush Handlers -------------------------------------------- 1.29s
2025-09-23 19:14:03.176587 | orchestrator | openvswitch : Copying over ovs-vsctl wrapper ---------------------------- 1.28s
2025-09-23 19:14:03.176595 | orchestrator | openvswitch : Create /run/openvswitch directory on host ----------------- 1.25s
2025-09-23 19:14:03.176604 | orchestrator | 2025-09-23 19:14:03 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:03.176612 | orchestrator | 2025-09-23 19:14:03 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:03.177732 | orchestrator | 2025-09-23 19:14:03 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:03.177748 |
orchestrator | 2025-09-23 19:14:03 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:06.223043 | orchestrator | 2025-09-23 19:14:06 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:06.223146 | orchestrator | 2025-09-23 19:14:06 | INFO  | Task 81b4f79b-76b8-4731-af5d-80f75f5cdffa is in state STARTED
2025-09-23 19:14:06.223648 | orchestrator | 2025-09-23 19:14:06 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:06.224608 | orchestrator | 2025-09-23 19:14:06 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:06.226276 | orchestrator | 2025-09-23 19:14:06 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:06.226312 | orchestrator | 2025-09-23 19:14:06 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:09.268335 | orchestrator | 2025-09-23 19:14:09 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:09.269400 | orchestrator | 2025-09-23 19:14:09 | INFO  | Task 81b4f79b-76b8-4731-af5d-80f75f5cdffa is in state STARTED
2025-09-23 19:14:09.273041 | orchestrator | 2025-09-23 19:14:09 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:09.274652 | orchestrator | 2025-09-23 19:14:09 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:09.277522 | orchestrator | 2025-09-23 19:14:09 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:09.277578 | orchestrator | 2025-09-23 19:14:09 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:12.313168 | orchestrator | 2025-09-23 19:14:12 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:12.313531 | orchestrator | 2025-09-23 19:14:12 | INFO  | Task 81b4f79b-76b8-4731-af5d-80f75f5cdffa is in state STARTED
2025-09-23 19:14:12.314126 | orchestrator | 2025-09-23 19:14:12 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:12.315044 | orchestrator | 2025-09-23 19:14:12 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:12.318636 | orchestrator | 2025-09-23 19:14:12 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:12.318728 | orchestrator | 2025-09-23 19:14:12 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:15.361354 | orchestrator | 2025-09-23 19:14:15 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:15.363435 | orchestrator | 2025-09-23 19:14:15 | INFO  | Task 81b4f79b-76b8-4731-af5d-80f75f5cdffa is in state STARTED
2025-09-23 19:14:15.365302 | orchestrator | 2025-09-23 19:14:15 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:15.367963 | orchestrator | 2025-09-23 19:14:15 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:15.370586 | orchestrator | 2025-09-23 19:14:15 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:15.370602 | orchestrator | 2025-09-23 19:14:15 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:18.402385 | orchestrator | 2025-09-23 19:14:18 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:18.404841 | orchestrator | 2025-09-23 19:14:18 | INFO  | Task 81b4f79b-76b8-4731-af5d-80f75f5cdffa is in state STARTED
2025-09-23 19:14:18.409560 | orchestrator | 2025-09-23 19:14:18 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:18.411506 | orchestrator | 2025-09-23 19:14:18 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:18.412915 | orchestrator | 2025-09-23 19:14:18 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:18.412950 | orchestrator | 2025-09-23 19:14:18 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:21.452074 | orchestrator | 2025-09-23 19:14:21 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:21.453103 | orchestrator | 2025-09-23 19:14:21 | INFO  | Task 81b4f79b-76b8-4731-af5d-80f75f5cdffa is in state SUCCESS
2025-09-23 19:14:21.454227 | orchestrator |
2025-09-23 19:14:21.454263 | orchestrator |
2025-09-23 19:14:21.454275 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:14:21.454286 | orchestrator |
2025-09-23 19:14:21.454298 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:14:21.454309 | orchestrator | Tuesday 23 September 2025 19:14:06 +0000 (0:00:00.179) 0:00:00.179 *****
2025-09-23 19:14:21.454320 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:14:21.454332 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:14:21.454343 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:14:21.454354 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:14:21.454364 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:14:21.454375 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:14:21.454413 | orchestrator |
2025-09-23 19:14:21.454424 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:14:21.454435 | orchestrator | Tuesday 23 September 2025 19:14:07 +0000 (0:00:00.951) 0:00:01.130 *****
2025-09-23 19:14:21.454445 | orchestrator | ok: [testbed-node-3] => (item=enable_ovn_True)
2025-09-23 19:14:21.454456 | orchestrator | ok: [testbed-node-4] => (item=enable_ovn_True)
2025-09-23 19:14:21.454467 | orchestrator | ok: [testbed-node-5] => (item=enable_ovn_True)
2025-09-23 19:14:21.454478 | orchestrator | ok: [testbed-node-0] => (item=enable_ovn_True)
2025-09-23 19:14:21.454488 | orchestrator | ok: [testbed-node-1] => (item=enable_ovn_True)
2025-09-23 19:14:21.454499 |
orchestrator | ok: [testbed-node-2] => (item=enable_ovn_True) 2025-09-23 19:14:21.454510 | orchestrator | 2025-09-23 19:14:21.454520 | orchestrator | PLAY [Apply role ovn-controller] *********************************************** 2025-09-23 19:14:21.454531 | orchestrator | 2025-09-23 19:14:21.454541 | orchestrator | TASK [ovn-controller : include_tasks] ****************************************** 2025-09-23 19:14:21.454552 | orchestrator | Tuesday 23 September 2025 19:14:09 +0000 (0:00:01.527) 0:00:02.658 ***** 2025-09-23 19:14:21.454564 | orchestrator | included: /ansible/roles/ovn-controller/tasks/deploy.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:14:21.454577 | orchestrator | 2025-09-23 19:14:21.454587 | orchestrator | TASK [ovn-controller : Ensuring config directories exist] ********************** 2025-09-23 19:14:21.454598 | orchestrator | Tuesday 23 September 2025 19:14:10 +0000 (0:00:01.174) 0:00:03.832 ***** 2025-09-23 19:14:21.454611 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:14:21.454626 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:14:21.454651 | orchestrator | changed: [testbed-node-4] => 
(item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454663 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454675 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454686 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454706 | orchestrator |
2025-09-23 19:14:21.454729 | orchestrator | TASK [ovn-controller : Copying over config.json files for services] ************
2025-09-23 19:14:21.454741 | orchestrator | Tuesday 23 September 2025 19:14:11 +0000 (0:00:01.107) 0:00:04.939 *****
2025-09-23 19:14:21.454753 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454787 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454798 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454809 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454820 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454838 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454850 | orchestrator |
2025-09-23 19:14:21.454864 | orchestrator | TASK [ovn-controller : Ensuring systemd override directory exists] *************
2025-09-23 19:14:21.454876 | orchestrator | Tuesday 23 September 2025 19:14:13 +0000 (0:00:01.421) 0:00:06.361 *****
2025-09-23 19:14:21.454889 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454901 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454929 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454942 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454955 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454967 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.454979 | orchestrator |
2025-09-23 19:14:21.454990 | orchestrator | TASK [ovn-controller : Copying over systemd override] **************************
2025-09-23 19:14:21.455001 | orchestrator | Tuesday 23 September 2025 19:14:14 +0000 (0:00:01.079) 0:00:07.440 *****
2025-09-23 19:14:21.455012 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455023 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455039 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455051 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455068 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455079 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455090 | orchestrator |
2025-09-23 19:14:21.455106 | orchestrator | TASK [ovn-controller : Check ovn-controller containers] ************************
2025-09-23 19:14:21.455117 | orchestrator | Tuesday 23 September 2025 19:14:15 +0000 (0:00:01.542) 0:00:08.982 *****
2025-09-23 19:14:21.455128 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455139 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455151 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455162 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455172 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455188 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro',
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:14:21.455199 | orchestrator |
2025-09-23 19:14:21.455211 | orchestrator | TASK [ovn-controller : Create br-int bridge on OpenvSwitch] ********************
2025-09-23 19:14:21.455228 | orchestrator | Tuesday 23 September 2025 19:14:17 +0000 (0:00:01.270) 0:00:10.253 *****
2025-09-23 19:14:21.455239 | orchestrator | fatal: [testbed-node-4]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:21.455250 | orchestrator | fatal: [testbed-node-3]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:21.455260 | orchestrator | fatal: [testbed-node-5]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:21.455271 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:21.455281 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:21.455292 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:14:21.455302 | orchestrator |
2025-09-23 19:14:21.455313 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:14:21.455324 | orchestrator | testbed-node-0 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:14:21.455336 | orchestrator | testbed-node-1 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:14:21.455352 | orchestrator | testbed-node-2 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:14:21.455364 | orchestrator | testbed-node-3 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:14:21.455379 | orchestrator | testbed-node-4 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:14:21.455390 | orchestrator | testbed-node-5 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:14:21.455401 | orchestrator |
2025-09-23 19:14:21.455411 | orchestrator |
2025-09-23 19:14:21.455422 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:14:21.455433 | orchestrator | Tuesday 23 September 2025 19:14:18 +0000 (0:00:01.563) 0:00:11.817 *****
2025-09-23 19:14:21.455444 | orchestrator | ===============================================================================
2025-09-23 19:14:21.455454 | orchestrator | ovn-controller : Create br-int bridge on OpenvSwitch -------------------- 1.56s
2025-09-23 19:14:21.455465 | orchestrator | ovn-controller : Copying over systemd override -------------------------- 1.54s
2025-09-23 19:14:21.455476 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.53s
2025-09-23 19:14:21.455487 | orchestrator | ovn-controller : Copying over config.json files for services ------------ 1.42s
2025-09-23 19:14:21.455497 | orchestrator | ovn-controller : Check ovn-controller containers ------------------------ 1.27s
2025-09-23 19:14:21.455508 | orchestrator | ovn-controller : include_tasks ------------------------------------------ 1.17s
2025-09-23 19:14:21.455518 | orchestrator | ovn-controller : Ensuring config directories exist ---------------------- 1.11s
2025-09-23 19:14:21.455529 | orchestrator | ovn-controller : Ensuring systemd override directory exists ------------- 1.08s
2025-09-23 19:14:21.455540 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.95s
2025-09-23 19:14:21.457033 | orchestrator | 2025-09-23 19:14:21 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:21.459001 | orchestrator | 2025-09-23 19:14:21 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:21.462904 | orchestrator | 2025-09-23 19:14:21 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:21.462952 | orchestrator | 2025-09-23 19:14:21 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:24.494879 | orchestrator | 2025-09-23 19:14:24 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:24.494982 | orchestrator | 2025-09-23 19:14:24 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:14:24.495921 | orchestrator | 2025-09-23 19:14:24 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:14:24.498147 | orchestrator | 2025-09-23 19:14:24 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:14:24.498174 | orchestrator | 2025-09-23 19:14:24 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:14:27.537136 | orchestrator | 2025-09-23 19:14:27 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:14:27.537704 | orchestrator | 2025-09-23 19:14:27 | INFO  | Task
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:27.538409 | orchestrator | 2025-09-23 19:14:27 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:27.539442 | orchestrator | 2025-09-23 19:14:27 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:27.539466 | orchestrator | 2025-09-23 19:14:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:30.581737 | orchestrator | 2025-09-23 19:14:30 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:30.582323 | orchestrator | 2025-09-23 19:14:30 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:30.583195 | orchestrator | 2025-09-23 19:14:30 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:30.583962 | orchestrator | 2025-09-23 19:14:30 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:30.584215 | orchestrator | 2025-09-23 19:14:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:33.610688 | orchestrator | 2025-09-23 19:14:33 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:33.611314 | orchestrator | 2025-09-23 19:14:33 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:33.612781 | orchestrator | 2025-09-23 19:14:33 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:33.613503 | orchestrator | 2025-09-23 19:14:33 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:33.613528 | orchestrator | 2025-09-23 19:14:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:36.641113 | orchestrator | 2025-09-23 19:14:36 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:36.641202 | orchestrator | 2025-09-23 19:14:36 | INFO  | Task 
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:36.641896 | orchestrator | 2025-09-23 19:14:36 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:36.642611 | orchestrator | 2025-09-23 19:14:36 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:36.642643 | orchestrator | 2025-09-23 19:14:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:39.710785 | orchestrator | 2025-09-23 19:14:39 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:39.716224 | orchestrator | 2025-09-23 19:14:39 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:39.719985 | orchestrator | 2025-09-23 19:14:39 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:39.721185 | orchestrator | 2025-09-23 19:14:39 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:39.721217 | orchestrator | 2025-09-23 19:14:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:42.762381 | orchestrator | 2025-09-23 19:14:42 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:42.764287 | orchestrator | 2025-09-23 19:14:42 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:42.767178 | orchestrator | 2025-09-23 19:14:42 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:42.772466 | orchestrator | 2025-09-23 19:14:42 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:42.772900 | orchestrator | 2025-09-23 19:14:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:45.814047 | orchestrator | 2025-09-23 19:14:45 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:45.814173 | orchestrator | 2025-09-23 19:14:45 | INFO  | Task 
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:45.814181 | orchestrator | 2025-09-23 19:14:45 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:45.814193 | orchestrator | 2025-09-23 19:14:45 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:45.814289 | orchestrator | 2025-09-23 19:14:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:48.858276 | orchestrator | 2025-09-23 19:14:48 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:48.859192 | orchestrator | 2025-09-23 19:14:48 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:48.861051 | orchestrator | 2025-09-23 19:14:48 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:48.863085 | orchestrator | 2025-09-23 19:14:48 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:48.863124 | orchestrator | 2025-09-23 19:14:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:51.900163 | orchestrator | 2025-09-23 19:14:51 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:51.905836 | orchestrator | 2025-09-23 19:14:51 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:51.909552 | orchestrator | 2025-09-23 19:14:51 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:51.914466 | orchestrator | 2025-09-23 19:14:51 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:51.914563 | orchestrator | 2025-09-23 19:14:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:54.958361 | orchestrator | 2025-09-23 19:14:54 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:54.958466 | orchestrator | 2025-09-23 19:14:54 | INFO  | Task 
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:54.960205 | orchestrator | 2025-09-23 19:14:54 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:54.962173 | orchestrator | 2025-09-23 19:14:54 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:54.962210 | orchestrator | 2025-09-23 19:14:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:14:58.245901 | orchestrator | 2025-09-23 19:14:58 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:14:58.246012 | orchestrator | 2025-09-23 19:14:58 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:14:58.246748 | orchestrator | 2025-09-23 19:14:58 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:14:58.248497 | orchestrator | 2025-09-23 19:14:58 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:14:58.248520 | orchestrator | 2025-09-23 19:14:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:01.280840 | orchestrator | 2025-09-23 19:15:01 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:01.282742 | orchestrator | 2025-09-23 19:15:01 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:01.282778 | orchestrator | 2025-09-23 19:15:01 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:01.282790 | orchestrator | 2025-09-23 19:15:01 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:01.282802 | orchestrator | 2025-09-23 19:15:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:04.574084 | orchestrator | 2025-09-23 19:15:04 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:04.574172 | orchestrator | 2025-09-23 19:15:04 | INFO  | Task 
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:04.574187 | orchestrator | 2025-09-23 19:15:04 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:04.574199 | orchestrator | 2025-09-23 19:15:04 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:04.574211 | orchestrator | 2025-09-23 19:15:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:07.575120 | orchestrator | 2025-09-23 19:15:07 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:07.575210 | orchestrator | 2025-09-23 19:15:07 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:07.575226 | orchestrator | 2025-09-23 19:15:07 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:07.575238 | orchestrator | 2025-09-23 19:15:07 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:07.575250 | orchestrator | 2025-09-23 19:15:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:10.602550 | orchestrator | 2025-09-23 19:15:10 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:10.603909 | orchestrator | 2025-09-23 19:15:10 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:10.606270 | orchestrator | 2025-09-23 19:15:10 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:10.607823 | orchestrator | 2025-09-23 19:15:10 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:10.607852 | orchestrator | 2025-09-23 19:15:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:13.637598 | orchestrator | 2025-09-23 19:15:13 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:13.638094 | orchestrator | 2025-09-23 19:15:13 | INFO  | Task 
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:13.638904 | orchestrator | 2025-09-23 19:15:13 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:13.640324 | orchestrator | 2025-09-23 19:15:13 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:13.640386 | orchestrator | 2025-09-23 19:15:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:16.678763 | orchestrator | 2025-09-23 19:15:16 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:16.678895 | orchestrator | 2025-09-23 19:15:16 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:16.679433 | orchestrator | 2025-09-23 19:15:16 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:16.681490 | orchestrator | 2025-09-23 19:15:16 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:16.681525 | orchestrator | 2025-09-23 19:15:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:19.716089 | orchestrator | 2025-09-23 19:15:19 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:19.716176 | orchestrator | 2025-09-23 19:15:19 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:19.717486 | orchestrator | 2025-09-23 19:15:19 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:19.718465 | orchestrator | 2025-09-23 19:15:19 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:19.718492 | orchestrator | 2025-09-23 19:15:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:22.746431 | orchestrator | 2025-09-23 19:15:22 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:22.750153 | orchestrator | 2025-09-23 19:15:22 | INFO  | Task 
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:22.751731 | orchestrator | 2025-09-23 19:15:22 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:22.751761 | orchestrator | 2025-09-23 19:15:22 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:22.751774 | orchestrator | 2025-09-23 19:15:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:25.784792 | orchestrator | 2025-09-23 19:15:25 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:25.785614 | orchestrator | 2025-09-23 19:15:25 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:25.787087 | orchestrator | 2025-09-23 19:15:25 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:25.788126 | orchestrator | 2025-09-23 19:15:25 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:25.788167 | orchestrator | 2025-09-23 19:15:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:28.824029 | orchestrator | 2025-09-23 19:15:28 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:28.825265 | orchestrator | 2025-09-23 19:15:28 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:28.827542 | orchestrator | 2025-09-23 19:15:28 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED 2025-09-23 19:15:28.830351 | orchestrator | 2025-09-23 19:15:28 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED 2025-09-23 19:15:28.830376 | orchestrator | 2025-09-23 19:15:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:31.867015 | orchestrator | 2025-09-23 19:15:31 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:31.867867 | orchestrator | 2025-09-23 19:15:31 | INFO  | Task 
548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:15:31.870835 | orchestrator | 2025-09-23 19:15:31 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:15:31.874796 | orchestrator | 2025-09-23 19:15:31 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:15:31.878081 | orchestrator | 2025-09-23 19:15:31 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:15:34.929236 | orchestrator | 2025-09-23 19:15:34 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:15:34.929331 | orchestrator | 2025-09-23 19:15:34 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:15:34.930731 | orchestrator | 2025-09-23 19:15:34 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:15:34.931236 | orchestrator | 2025-09-23 19:15:34 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:15:34.931340 | orchestrator | 2025-09-23 19:15:34 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:15:37.967882 | orchestrator | 2025-09-23 19:15:37 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:15:37.970068 | orchestrator | 2025-09-23 19:15:37 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:15:37.970997 | orchestrator | 2025-09-23 19:15:37 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:15:37.973466 | orchestrator | 2025-09-23 19:15:37 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:15:37.974787 | orchestrator | 2025-09-23 19:15:37 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:15:41.020000 | orchestrator | 2025-09-23 19:15:41 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:15:41.020097 | orchestrator | 2025-09-23 19:15:41 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:15:41.021117 | orchestrator | 2025-09-23 19:15:41 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:15:41.041569 | orchestrator | 2025-09-23 19:15:41 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:15:41.041615 | orchestrator | 2025-09-23 19:15:41 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:15:44.083991 | orchestrator | 2025-09-23 19:15:44 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:15:44.086729 | orchestrator | 2025-09-23 19:15:44 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:15:44.086765 | orchestrator | 2025-09-23 19:15:44 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:15:44.086776 | orchestrator | 2025-09-23 19:15:44 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:15:44.086786 | orchestrator | 2025-09-23 19:15:44 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:15:47.117005 | orchestrator | 2025-09-23 19:15:47 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:15:47.117088 | orchestrator | 2025-09-23 19:15:47 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:15:47.117103 | orchestrator | 2025-09-23 19:15:47 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:15:47.119696 | orchestrator | 2025-09-23 19:15:47 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state STARTED
2025-09-23 19:15:47.119727 | orchestrator | 2025-09-23 19:15:47 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:15:50.150812 | orchestrator | 2025-09-23 19:15:50 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:15:50.151031 | orchestrator | 2025-09-23 19:15:50 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:15:50.151626 | orchestrator | 2025-09-23 19:15:50 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state STARTED
2025-09-23 19:15:50.152329 | orchestrator | 2025-09-23 19:15:50 | INFO  | Task 090568e2-6305-4a8d-8f14-6b982b5739f2 is in state SUCCESS
2025-09-23 19:15:50.153424 | orchestrator |
2025-09-23 19:15:50.153452 | orchestrator |
2025-09-23 19:15:50.153464 | orchestrator | PLAY [Set kolla_action_rabbitmq] ***********************************************
2025-09-23 19:15:50.153475 | orchestrator |
2025-09-23 19:15:50.153486 | orchestrator | TASK [Inform the user about the following task] ********************************
2025-09-23 19:15:50.153498 | orchestrator | Tuesday 23 September 2025 19:13:32 +0000 (0:00:00.103) 0:00:00.103 *****
2025-09-23 19:15:50.153509 | orchestrator | ok: [localhost] => {
2025-09-23 19:15:50.153521 | orchestrator |  "msg": "The task 'Check RabbitMQ service' fails if the RabbitMQ service has not yet been deployed. This is fine."
2025-09-23 19:15:50.153532 | orchestrator | }
2025-09-23 19:15:50.153543 | orchestrator |
2025-09-23 19:15:50.153554 | orchestrator | TASK [Check RabbitMQ service] **************************************************
2025-09-23 19:15:50.153578 | orchestrator | Tuesday 23 September 2025 19:13:32 +0000 (0:00:00.047) 0:00:00.150 *****
2025-09-23 19:15:50.153590 | orchestrator | fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string RabbitMQ Management in 192.168.16.9:15672"}
2025-09-23 19:15:50.153603 | orchestrator | ...ignoring
2025-09-23 19:15:50.153615 | orchestrator |
2025-09-23 19:15:50.153626 | orchestrator | TASK [Set kolla_action_rabbitmq = upgrade if RabbitMQ is already running] ******
2025-09-23 19:15:50.153663 | orchestrator | Tuesday 23 September 2025 19:13:35 +0000 (0:00:02.814) 0:00:02.965 *****
2025-09-23 19:15:50.153674 | orchestrator | skipping: [localhost]
2025-09-23 19:15:50.153684 | orchestrator |
2025-09-23 19:15:50.153695 | orchestrator | TASK [Set kolla_action_rabbitmq = kolla_action_ng] *****************************
2025-09-23 19:15:50.153706 | orchestrator | Tuesday 23 September 2025 19:13:35 +0000 (0:00:00.056) 0:00:03.021 *****
2025-09-23 19:15:50.153717 | orchestrator | ok: [localhost]
2025-09-23 19:15:50.153728 | orchestrator |
2025-09-23 19:15:50.153738 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:15:50.153749 | orchestrator |
2025-09-23 19:15:50.153760 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:15:50.153771 | orchestrator | Tuesday 23 September 2025 19:13:35 +0000 (0:00:00.148) 0:00:03.170 *****
2025-09-23 19:15:50.153781 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:50.153792 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:50.153803 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:50.153813 | orchestrator |
2025-09-23 19:15:50.153824 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:15:50.153835 | orchestrator | Tuesday 23 September 2025 19:13:35 +0000 (0:00:00.423) 0:00:03.593 *****
2025-09-23 19:15:50.153846 | orchestrator | ok: [testbed-node-0] => (item=enable_rabbitmq_True)
2025-09-23 19:15:50.153857 | orchestrator | ok: [testbed-node-1] => (item=enable_rabbitmq_True)
2025-09-23 19:15:50.153868 | orchestrator | ok: [testbed-node-2] => (item=enable_rabbitmq_True)
2025-09-23 19:15:50.153878 | orchestrator |
2025-09-23 19:15:50.153890 | orchestrator | PLAY [Apply role rabbitmq] *****************************************************
2025-09-23 19:15:50.153901 | orchestrator |
2025-09-23 19:15:50.153911 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2025-09-23 19:15:50.153922 | orchestrator | Tuesday 23 September 2025 19:13:36 +0000 (0:00:00.609) 0:00:04.203 *****
2025-09-23 19:15:50.153933 | orchestrator | included: /ansible/roles/rabbitmq/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:15:50.153944 | orchestrator |
2025-09-23 19:15:50.153955 | orchestrator | TASK [rabbitmq : Get container facts] ******************************************
2025-09-23 19:15:50.153965 | orchestrator | Tuesday 23 September 2025 19:13:36 +0000 (0:00:00.480) 0:00:04.683 *****
2025-09-23 19:15:50.153989 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:50.154000 | orchestrator |
2025-09-23 19:15:50.154012 | orchestrator | TASK [rabbitmq : Get current RabbitMQ version] *********************************
2025-09-23 19:15:50.154069 | orchestrator | Tuesday 23 September 2025 19:13:37 +0000 (0:00:01.000) 0:00:05.683 *****
2025-09-23 19:15:50.154082 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:50.154094 | orchestrator |
2025-09-23 19:15:50.154106 | orchestrator | TASK [rabbitmq : Get new RabbitMQ version] *************************************
2025-09-23 19:15:50.154120 | orchestrator | Tuesday 23 September 2025 19:13:38 +0000 (0:00:00.372) 0:00:06.056 *****
2025-09-23 19:15:50.154132 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:50.154144 | orchestrator |
2025-09-23 19:15:50.154156 | orchestrator | TASK [rabbitmq : Check if running RabbitMQ is at most one version behind] ******
2025-09-23 19:15:50.154169 | orchestrator | Tuesday 23 September 2025 19:13:38 +0000 (0:00:00.387) 0:00:06.443 *****
2025-09-23 19:15:50.154181 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:50.154193 | orchestrator |
2025-09-23 19:15:50.154205 | orchestrator | TASK [rabbitmq : Catch when RabbitMQ is being downgraded] **********************
2025-09-23 19:15:50.154218 | orchestrator | Tuesday 23 September 2025 19:13:39 +0000 (0:00:00.398) 0:00:06.841 *****
2025-09-23 19:15:50.154231 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:50.154243 | orchestrator |
2025-09-23 19:15:50.154255 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2025-09-23 19:15:50.154267 | orchestrator | Tuesday 23 September 2025 19:13:39 +0000 (0:00:00.321) 0:00:07.163 *****
2025-09-23 19:15:50.154280 | orchestrator | included: /ansible/roles/rabbitmq/tasks/remove-ha-all-policy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:15:50.154292 | orchestrator |
2025-09-23 19:15:50.154305 | orchestrator | TASK [rabbitmq : Get container facts] ******************************************
2025-09-23 19:15:50.154317 | orchestrator | Tuesday 23 September 2025 19:13:40 +0000 (0:00:00.936) 0:00:08.100 *****
2025-09-23 19:15:50.154329 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:50.154341 | orchestrator |
2025-09-23 19:15:50.154353 | orchestrator | TASK [rabbitmq : List RabbitMQ policies] ***************************************
2025-09-23 19:15:50.154366 | orchestrator | Tuesday 23 September 2025 19:13:41 +0000 (0:00:00.854) 0:00:08.954 *****
2025-09-23 19:15:50.154377 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:50.154388 | orchestrator |
2025-09-23 19:15:50.154399 | orchestrator | TASK [rabbitmq : Remove ha-all policy from RabbitMQ] ***************************
2025-09-23 19:15:50.154410 | orchestrator | Tuesday 23 September 2025 19:13:41 +0000 (0:00:00.406) 0:00:09.360 *****
2025-09-23 19:15:50.154420 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:50.154431 | orchestrator |
2025-09-23 19:15:50.154452 | orchestrator | TASK [rabbitmq : Ensuring config directories exist] ****************************
2025-09-23 19:15:50.154464 | orchestrator | Tuesday 23 September 2025 19:13:41 +0000 (0:00:00.338) 0:00:09.699 *****
2025-09-23 19:15:50.154485 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.154511 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.154524 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.154536 | orchestrator |
2025-09-23 19:15:50.154547 | orchestrator | TASK [rabbitmq : Copying over config.json files for services] ******************
2025-09-23 19:15:50.154558 | orchestrator | Tuesday 23 September 2025 19:13:42 +0000 (0:00:00.909) 0:00:10.609 *****
2025-09-23 19:15:50.154579 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.154597 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.154616 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.154646 | orchestrator |
2025-09-23 19:15:50.154657 | orchestrator | TASK [rabbitmq : Copying over rabbitmq-env.conf] *******************************
2025-09-23 19:15:50.154668 | orchestrator | Tuesday 23 September 2025 19:13:44 +0000 (0:00:01.679) 0:00:12.288 *****
2025-09-23 19:15:50.154679 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2)
2025-09-23 19:15:50.154690 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2)
2025-09-23 19:15:50.154701 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2)
2025-09-23 19:15:50.154711 | orchestrator |
2025-09-23 19:15:50.154722 | orchestrator | TASK [rabbitmq : Copying over rabbitmq.conf] ***********************************
2025-09-23 19:15:50.154733 | orchestrator | Tuesday 23 September 2025 19:13:46 +0000 (0:00:01.467) 0:00:13.755 *****
2025-09-23 19:15:50.154744 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2025-09-23 19:15:50.154754 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2025-09-23 19:15:50.154765 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2025-09-23 19:15:50.154775 | orchestrator |
2025-09-23 19:15:50.154786 | orchestrator | TASK [rabbitmq : Copying over erl_inetrc] **************************************
2025-09-23 19:15:50.154797 | orchestrator | Tuesday 23 September 2025 19:13:49 +0000 (0:00:03.658) 0:00:17.414 *****
2025-09-23 19:15:50.154807 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2025-09-23 19:15:50.154818 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2025-09-23 19:15:50.154829 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2025-09-23 19:15:50.154839 | orchestrator |
2025-09-23 19:15:50.154850 | orchestrator | TASK [rabbitmq : Copying over advanced.config] *********************************
2025-09-23 19:15:50.154861 | orchestrator | Tuesday 23 September 2025 19:13:52 +0000 (0:00:02.351) 0:00:19.766 *****
2025-09-23 19:15:50.154878 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2025-09-23 19:15:50.154889 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2025-09-23 19:15:50.154900 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2025-09-23 19:15:50.154917 | orchestrator |
2025-09-23 19:15:50.154928 | orchestrator | TASK [rabbitmq : Copying over definitions.json] ********************************
2025-09-23 19:15:50.154939 | orchestrator | Tuesday 23 September 2025 19:13:53 +0000 (0:00:01.623) 0:00:21.389 *****
2025-09-23 19:15:50.154954 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2025-09-23 19:15:50.154965 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2025-09-23 19:15:50.154976 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2025-09-23 19:15:50.154987 | orchestrator |
2025-09-23 19:15:50.154998 | orchestrator | TASK [rabbitmq : Copying over enabled_plugins] *********************************
2025-09-23 19:15:50.155008 | orchestrator | Tuesday 23 September 2025 19:13:55 +0000 (0:00:01.601) 0:00:22.991 *****
2025-09-23 19:15:50.155019 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2025-09-23 19:15:50.155030 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2025-09-23 19:15:50.155041 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2025-09-23 19:15:50.155051 | orchestrator |
2025-09-23 19:15:50.155062 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2025-09-23 19:15:50.155073 | orchestrator | Tuesday 23 September 2025 19:13:56 +0000 (0:00:01.430) 0:00:24.421 *****
2025-09-23 19:15:50.155084 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:50.155095 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:15:50.155105 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:15:50.155116 | orchestrator |
2025-09-23 19:15:50.155127 | orchestrator | TASK [rabbitmq : Check rabbitmq containers] ************************************
2025-09-23 19:15:50.155138 | orchestrator | Tuesday 23 September 2025 19:13:57 +0000 (0:00:00.521) 0:00:24.943 *****
2025-09-23 19:15:50.155150 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.155162 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.155193 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:15:50.155205 | orchestrator |
2025-09-23 19:15:50.155216 | orchestrator | TASK [rabbitmq : Creating rabbitmq volume] *************************************
2025-09-23 19:15:50.155227 | orchestrator | Tuesday 23 September 2025 19:13:58 +0000 (0:00:01.605) 0:00:26.548 *****
2025-09-23 19:15:50.155238 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:50.155249 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:50.155259 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:50.155270 | orchestrator |
2025-09-23 19:15:50.155281 | orchestrator | TASK [rabbitmq : Running RabbitMQ bootstrap container] *************************
2025-09-23 19:15:50.155292 | orchestrator | Tuesday 23 September 2025 19:13:59 +0000 (0:00:01.023) 0:00:27.572 *****
2025-09-23 19:15:50.155302 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:50.155313 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:50.155324 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:50.155334 | orchestrator |
2025-09-23 19:15:50.155345 | orchestrator | RUNNING HANDLER [rabbitmq : Restart rabbitmq container] ************************
2025-09-23 19:15:50.155356 | orchestrator | Tuesday 23 September 2025 19:14:06 +0000 (0:00:06.645) 0:00:34.218 *****
2025-09-23 19:15:50.155367 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:50.155378 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:50.155388 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:50.155399 | orchestrator |
2025-09-23 19:15:50.155410 | orchestrator | PLAY [Restart rabbitmq services] ***********************************************
2025-09-23 19:15:50.155421 | orchestrator |
2025-09-23 19:15:50.155431 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] *******************************
2025-09-23 19:15:50.155442 | orchestrator | Tuesday 23 September 2025 19:14:07 +0000 (0:00:00.893) 0:00:35.112 *****
2025-09-23 19:15:50.155453 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:50.155464 | orchestrator |
2025-09-23 19:15:50.155474 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] **********************
2025-09-23 19:15:50.155485 | orchestrator | Tuesday 23 September 2025 19:14:08 +0000 (0:00:00.784) 0:00:35.896 *****
2025-09-23 19:15:50.155496 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:50.155507 | orchestrator |
2025-09-23 19:15:50.155517 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] ***********************************
2025-09-23 19:15:50.155528 | orchestrator | Tuesday 23 September 2025 19:14:08 +0000 (0:00:00.490) 0:00:36.386 *****
2025-09-23 19:15:50.155539 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:50.155550 | orchestrator |
2025-09-23 19:15:50.155561 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ********************************
2025-09-23 19:15:50.155572 | orchestrator | Tuesday 23 September 2025 19:14:10 +0000 (0:00:01.734) 0:00:38.121 *****
2025-09-23 19:15:50.155583 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:50.155593 | orchestrator |
2025-09-23 19:15:50.155604 | orchestrator | PLAY [Restart rabbitmq services] ***********************************************
2025-09-23 19:15:50.155615 | orchestrator |
2025-09-23 19:15:50.155626 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] *******************************
2025-09-23 19:15:50.155658 | orchestrator | Tuesday 23 September 2025 19:15:06 +0000 (0:00:55.683) 0:01:33.804 *****
2025-09-23 19:15:50.155669 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:50.155680 | orchestrator |
2025-09-23 19:15:50.155691 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] **********************
2025-09-23 19:15:50.155701 | orchestrator | Tuesday 23 September 2025 19:15:06 +0000 (0:00:00.595) 0:01:34.400 *****
2025-09-23 19:15:50.155712 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:15:50.155723 | orchestrator |
2025-09-23 19:15:50.155733 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] ***********************************
2025-09-23 19:15:50.155744 | orchestrator | Tuesday 23 September 2025 19:15:07 +0000 (0:00:00.307) 0:01:34.708 *****
2025-09-23 19:15:50.155755 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:50.155765 | orchestrator |
2025-09-23 19:15:50.155776 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ********************************
2025-09-23 19:15:50.155787 | orchestrator | Tuesday 23 September 2025 19:15:08 +0000 (0:00:01.475) 0:01:36.183 *****
2025-09-23 19:15:50.155797 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:50.155808 | orchestrator |
2025-09-23 19:15:50.155818 | orchestrator | PLAY [Restart rabbitmq services] ***********************************************
2025-09-23 19:15:50.155829 | orchestrator |
2025-09-23 19:15:50.155840 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] *******************************
2025-09-23 19:15:50.155850 | orchestrator | Tuesday 23 September 2025 19:15:23 +0000 (0:00:15.281) 0:01:51.464 *****
2025-09-23 19:15:50.155861 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:50.155872 | orchestrator |
2025-09-23 19:15:50.155882 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] **********************
2025-09-23 19:15:50.155893 | orchestrator | Tuesday 23 September 2025 19:15:24 +0000 (0:00:00.583) 0:01:52.048 *****
2025-09-23 19:15:50.155904 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:15:50.155914 | orchestrator |
2025-09-23 19:15:50.155925 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] ***********************************
2025-09-23 19:15:50.155935 | orchestrator | Tuesday 23 September 2025 19:15:24 +0000 (0:00:00.398) 0:01:52.446 *****
2025-09-23 19:15:50.155946 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:50.155957 | orchestrator |
2025-09-23 19:15:50.155968 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ********************************
2025-09-23 19:15:50.155985 | orchestrator | Tuesday 23 September 2025 19:15:31 +0000 (0:00:06.704) 0:01:59.151 *****
2025-09-23 19:15:50.155996 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:50.156007 | orchestrator |
2025-09-23 19:15:50.156017 | orchestrator | PLAY [Apply rabbitmq post-configuration] ***************************************
2025-09-23 19:15:50.156028 | orchestrator |
2025-09-23 19:15:50.156039 | orchestrator | TASK [Include rabbitmq post-deploy.yml] ****************************************
2025-09-23 19:15:50.156049 | orchestrator | Tuesday 23 September 2025 19:15:43 +0000 (0:00:12.139) 0:02:11.290 *****
2025-09-23 19:15:50.156060 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:15:50.156071 | orchestrator |
2025-09-23 19:15:50.156081 | orchestrator | TASK [rabbitmq : Enable all stable feature flags] ******************************
2025-09-23 19:15:50.156096 | orchestrator | Tuesday 23 September 2025 19:15:44 +0000 (0:00:00.836) 0:02:12.126 *****
2025-09-23 19:15:50.156108 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring:
2025-09-23 19:15:50.156118 | orchestrator | enable_outward_rabbitmq_True
2025-09-23 19:15:50.156129 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring:
2025-09-23 19:15:50.156140 | orchestrator | outward_rabbitmq_restart
2025-09-23 19:15:50.156150 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:50.156161 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:50.156172 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:50.156182 | orchestrator |
2025-09-23 19:15:50.156193 | orchestrator | PLAY [Apply role rabbitmq (outward)] *******************************************
2025-09-23 19:15:50.156204 | orchestrator | skipping: no hosts matched
2025-09-23 19:15:50.156214 | orchestrator |
2025-09-23 19:15:50.156231 | orchestrator | PLAY [Restart rabbitmq (outward) services] *************************************
2025-09-23 19:15:50.156242 | orchestrator | skipping: no hosts matched
2025-09-23 19:15:50.156252 | orchestrator |
2025-09-23 19:15:50.156263 | orchestrator | PLAY [Apply rabbitmq (outward) post-configuration] *****************************
2025-09-23 19:15:50.156274 | orchestrator | skipping: no hosts matched
2025-09-23 19:15:50.156285 | orchestrator |
2025-09-23 19:15:50.156296 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:15:50.156306 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1
2025-09-23 19:15:50.156318 | orchestrator | testbed-node-0 : ok=23  changed=14  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0
2025-09-23 19:15:50.156329 | orchestrator | testbed-node-1 : ok=21  changed=14  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:15:50.156340 | orchestrator | testbed-node-2 : ok=21  changed=14  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:15:50.156351 | orchestrator |
2025-09-23 19:15:50.156361 | orchestrator |
2025-09-23 19:15:50.156372 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:15:50.156383 | orchestrator | Tuesday 23 September 2025 19:15:47 +0000 (0:00:02.659) 0:02:14.786 *****
2025-09-23 19:15:50.156394 | orchestrator | ===============================================================================
2025-09-23 19:15:50.156405 | orchestrator | rabbitmq : Waiting for rabbitmq to start ------------------------------- 83.10s
2025-09-23 19:15:50.156415 | orchestrator | rabbitmq : Restart rabbitmq container ----------------------------------- 9.91s
2025-09-23 19:15:50.156426 | orchestrator | rabbitmq : Running RabbitMQ bootstrap container ------------------------- 6.65s
2025-09-23 19:15:50.156437 | orchestrator | rabbitmq : Copying over rabbitmq.conf ----------------------------------- 3.66s
2025-09-23 19:15:50.156447 | orchestrator | Check RabbitMQ service -------------------------------------------------- 2.81s
2025-09-23 19:15:50.156458 | orchestrator | rabbitmq : Enable all stable feature flags ------------------------------ 2.66s
2025-09-23 19:15:50.156469 | orchestrator | rabbitmq : Copying over erl_inetrc -------------------------------------- 2.35s
2025-09-23 19:15:50.156480 | orchestrator | rabbitmq : Get info on RabbitMQ container ------------------------------- 1.96s
2025-09-23 19:15:50.156490 | orchestrator | rabbitmq : Copying over config.json files for services ------------------ 1.68s
2025-09-23 19:15:50.156501 | orchestrator | rabbitmq : Copying over advanced.config --------------------------------- 1.62s
2025-09-23 19:15:50.156512 | orchestrator | rabbitmq : Check rabbitmq containers ------------------------------------ 1.61s
2025-09-23 19:15:50.156523 | orchestrator | rabbitmq : Copying over definitions.json -------------------------------- 1.60s
2025-09-23 19:15:50.156534 | orchestrator | rabbitmq : Copying over rabbitmq-env.conf ------------------------------- 1.47s
2025-09-23 19:15:50.156544 | orchestrator | rabbitmq : Copying over enabled_plugins --------------------------------- 1.43s
2025-09-23 19:15:50.156555 | orchestrator | rabbitmq : Put RabbitMQ node into maintenance mode ---------------------- 1.20s
2025-09-23 19:15:50.156566 | orchestrator | rabbitmq : Creating rabbitmq volume ------------------------------------- 1.02s
2025-09-23 19:15:50.156577 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 1.00s
2025-09-23 19:15:50.156588 | orchestrator | rabbitmq : include_tasks ------------------------------------------------ 0.94s
2025-09-23 19:15:50.156598 | orchestrator | rabbitmq : Ensuring config directories exist ---------------------------- 0.91s
2025-09-23 19:15:50.156609 | orchestrator | rabbitmq : Restart rabbitmq container ----------------------------------- 0.89s
2025-09-23 19:15:50.156620 | orchestrator | 2025-09-23 19:15:50 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:15:53.231700 | orchestrator | 2025-09-23 19:15:53 | INFO  | Task 87fad0cd-c392-4aeb-9be7-0792a9af9b74 is in state STARTED
2025-09-23 19:15:53.231826 | orchestrator | 2025-09-23 19:15:53 | INFO  | Task 840b5689-b177-42e5-841c-ef7d745de16e is in state STARTED
2025-09-23 19:15:53.234884 | orchestrator | 2025-09-23 19:15:53 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED
2025-09-23 19:15:53.235395 | orchestrator | 2025-09-23 19:15:53 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED
2025-09-23 19:15:53.236480 | orchestrator | 2025-09-23 19:15:53 | INFO  | Task 4640c0ea-ac5c-4fdb-bc21-57b435fb08da is in state SUCCESS
2025-09-23 19:15:53.238496 | orchestrator |
2025-09-23 19:15:53.238535 | orchestrator |
2025-09-23 19:15:53.238547 | orchestrator | PLAY [Prepare all k3s nodes] ***************************************************
2025-09-23 19:15:53.238558 | orchestrator |
2025-09-23 19:15:53.238577 | orchestrator | TASK [k3s_prereq : Validating arguments against arg spec 'main' - Prerequisites] ***
2025-09-23 19:15:53.238589 | orchestrator | Tuesday 23 September 2025 19:12:14 +0000 (0:00:00.201) 0:00:00.201 *****
2025-09-23 19:15:53.238600 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:15:53.238611 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:15:53.238641 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:15:53.238653 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:53.238663 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:53.238674 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:53.238685 | orchestrator |
2025-09-23 19:15:53.238696 | orchestrator | TASK [k3s_prereq : Set same timezone on every Server] **************************
2025-09-23 19:15:53.238707 | orchestrator | Tuesday 23 September 2025 19:12:15 +0000 (0:00:00.744) 0:00:00.946 *****
2025-09-23 19:15:53.238717 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:15:53.238729 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:15:53.238740 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:15:53.238751 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.238761 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:15:53.238772 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:15:53.238782 | orchestrator |
2025-09-23 19:15:53.238793 | orchestrator | TASK [k3s_prereq : Set SELinux to disabled state] ******************************
2025-09-23 19:15:53.238804 |
orchestrator | Tuesday 23 September 2025 19:12:15 +0000 (0:00:00.602) 0:00:01.548 ***** 2025-09-23 19:15:53.238815 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.238825 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.238836 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.238846 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.238857 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.238868 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.238878 | orchestrator | 2025-09-23 19:15:53.238889 | orchestrator | TASK [k3s_prereq : Enable IPv4 forwarding] ************************************* 2025-09-23 19:15:53.238900 | orchestrator | Tuesday 23 September 2025 19:12:16 +0000 (0:00:00.543) 0:00:02.091 ***** 2025-09-23 19:15:53.238910 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:15:53.238921 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:15:53.238931 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:15:53.238942 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.238952 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.238963 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.238974 | orchestrator | 2025-09-23 19:15:53.238984 | orchestrator | TASK [k3s_prereq : Enable IPv6 forwarding] ************************************* 2025-09-23 19:15:53.238995 | orchestrator | Tuesday 23 September 2025 19:12:18 +0000 (0:00:01.800) 0:00:03.892 ***** 2025-09-23 19:15:53.239006 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:15:53.239016 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:15:53.239027 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:15:53.239037 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.239048 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.239058 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.239069 | orchestrator | 2025-09-23 19:15:53.239080 | 
orchestrator | TASK [k3s_prereq : Enable IPv6 router advertisements] ************************** 2025-09-23 19:15:53.239107 | orchestrator | Tuesday 23 September 2025 19:12:19 +0000 (0:00:01.097) 0:00:04.989 ***** 2025-09-23 19:15:53.239118 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:15:53.239130 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:15:53.239142 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:15:53.239154 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.239166 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.239178 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.239190 | orchestrator | 2025-09-23 19:15:53.239202 | orchestrator | TASK [k3s_prereq : Add br_netfilter to /etc/modules-load.d/] ******************* 2025-09-23 19:15:53.239214 | orchestrator | Tuesday 23 September 2025 19:12:20 +0000 (0:00:01.157) 0:00:06.147 ***** 2025-09-23 19:15:53.239226 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.239238 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.239249 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.239261 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.239273 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.239285 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.239296 | orchestrator | 2025-09-23 19:15:53.239309 | orchestrator | TASK [k3s_prereq : Load br_netfilter] ****************************************** 2025-09-23 19:15:53.239321 | orchestrator | Tuesday 23 September 2025 19:12:21 +0000 (0:00:00.507) 0:00:06.654 ***** 2025-09-23 19:15:53.239333 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.239346 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.239358 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.239369 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.239381 | orchestrator | skipping: [testbed-node-1] 2025-09-23 
19:15:53.239393 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.239405 | orchestrator | 2025-09-23 19:15:53.239417 | orchestrator | TASK [k3s_prereq : Set bridge-nf-call-iptables (just to be sure)] ************** 2025-09-23 19:15:53.239429 | orchestrator | Tuesday 23 September 2025 19:12:21 +0000 (0:00:00.759) 0:00:07.414 ***** 2025-09-23 19:15:53.239441 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-23 19:15:53.239451 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-23 19:15:53.239462 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.239473 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-23 19:15:53.239483 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-23 19:15:53.239494 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-23 19:15:53.239504 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-23 19:15:53.239515 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.239526 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-23 19:15:53.239537 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-23 19:15:53.239558 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-23 19:15:53.239574 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-23 19:15:53.239585 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.239596 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.239607 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.239618 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-iptables)  
2025-09-23 19:15:53.239654 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-23 19:15:53.239665 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.239676 | orchestrator | 2025-09-23 19:15:53.239687 | orchestrator | TASK [k3s_prereq : Add /usr/local/bin to sudo secure_path] ********************* 2025-09-23 19:15:53.239697 | orchestrator | Tuesday 23 September 2025 19:12:23 +0000 (0:00:01.231) 0:00:08.645 ***** 2025-09-23 19:15:53.239714 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.239725 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.239736 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.239746 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.239757 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.239768 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.239778 | orchestrator | 2025-09-23 19:15:53.239789 | orchestrator | TASK [k3s_download : Validating arguments against arg spec 'main' - Manage the downloading of K3S binaries] *** 2025-09-23 19:15:53.239800 | orchestrator | Tuesday 23 September 2025 19:12:24 +0000 (0:00:01.025) 0:00:09.670 ***** 2025-09-23 19:15:53.239811 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:15:53.239822 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:15:53.239833 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:15:53.239844 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.239854 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.239865 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.239875 | orchestrator | 2025-09-23 19:15:53.239886 | orchestrator | TASK [k3s_download : Download k3s binary x64] ********************************** 2025-09-23 19:15:53.239897 | orchestrator | Tuesday 23 September 2025 19:12:24 +0000 (0:00:00.825) 0:00:10.495 ***** 2025-09-23 19:15:53.239908 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.239919 | 
orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.239929 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:15:53.239940 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:15:53.239950 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.239961 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:15:53.239972 | orchestrator | 2025-09-23 19:15:53.239983 | orchestrator | TASK [k3s_download : Download k3s binary arm64] ******************************** 2025-09-23 19:15:53.239993 | orchestrator | Tuesday 23 September 2025 19:12:30 +0000 (0:00:05.547) 0:00:16.042 ***** 2025-09-23 19:15:53.240004 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.240014 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.240025 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.240036 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.240047 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.240057 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.240068 | orchestrator | 2025-09-23 19:15:53.240078 | orchestrator | TASK [k3s_download : Download k3s binary armhf] ******************************** 2025-09-23 19:15:53.240089 | orchestrator | Tuesday 23 September 2025 19:12:32 +0000 (0:00:01.803) 0:00:17.846 ***** 2025-09-23 19:15:53.240100 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.240110 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.240121 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.240132 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.240142 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.240153 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.240163 | orchestrator | 2025-09-23 19:15:53.240174 | orchestrator | TASK [k3s_custom_registries : Validating arguments against arg spec 'main' - Configure the use of a custom container registry] *** 2025-09-23 19:15:53.240186 | 
orchestrator | Tuesday 23 September 2025 19:12:35 +0000 (0:00:03.005) 0:00:20.851 ***** 2025-09-23 19:15:53.240197 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:15:53.240208 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:15:53.240218 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:15:53.240229 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.240240 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.240250 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.240261 | orchestrator | 2025-09-23 19:15:53.240272 | orchestrator | TASK [k3s_custom_registries : Create directory /etc/rancher/k3s] *************** 2025-09-23 19:15:53.240283 | orchestrator | Tuesday 23 September 2025 19:12:36 +0000 (0:00:01.292) 0:00:22.144 ***** 2025-09-23 19:15:53.240293 | orchestrator | changed: [testbed-node-3] => (item=rancher) 2025-09-23 19:15:53.240304 | orchestrator | changed: [testbed-node-4] => (item=rancher) 2025-09-23 19:15:53.240320 | orchestrator | changed: [testbed-node-5] => (item=rancher) 2025-09-23 19:15:53.240331 | orchestrator | changed: [testbed-node-3] => (item=rancher/k3s) 2025-09-23 19:15:53.240342 | orchestrator | changed: [testbed-node-4] => (item=rancher/k3s) 2025-09-23 19:15:53.240352 | orchestrator | changed: [testbed-node-5] => (item=rancher/k3s) 2025-09-23 19:15:53.240363 | orchestrator | changed: [testbed-node-0] => (item=rancher) 2025-09-23 19:15:53.240373 | orchestrator | changed: [testbed-node-1] => (item=rancher) 2025-09-23 19:15:53.240384 | orchestrator | changed: [testbed-node-2] => (item=rancher) 2025-09-23 19:15:53.240394 | orchestrator | changed: [testbed-node-1] => (item=rancher/k3s) 2025-09-23 19:15:53.240405 | orchestrator | changed: [testbed-node-0] => (item=rancher/k3s) 2025-09-23 19:15:53.240415 | orchestrator | changed: [testbed-node-2] => (item=rancher/k3s) 2025-09-23 19:15:53.240426 | orchestrator | 2025-09-23 19:15:53.240437 | orchestrator | TASK [k3s_custom_registries : Insert registries into 
/etc/rancher/k3s/registries.yaml] *** 2025-09-23 19:15:53.240448 | orchestrator | Tuesday 23 September 2025 19:12:39 +0000 (0:00:02.498) 0:00:24.643 ***** 2025-09-23 19:15:53.240458 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:15:53.240469 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:15:53.240479 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:15:53.240490 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.240501 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.240511 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.240522 | orchestrator | 2025-09-23 19:15:53.240538 | orchestrator | PLAY [Deploy k3s master nodes] ************************************************* 2025-09-23 19:15:53.240549 | orchestrator | 2025-09-23 19:15:53.240560 | orchestrator | TASK [k3s_server : Validating arguments against arg spec 'main' - Setup k3s servers] *** 2025-09-23 19:15:53.240570 | orchestrator | Tuesday 23 September 2025 19:12:40 +0000 (0:00:01.639) 0:00:26.282 ***** 2025-09-23 19:15:53.240581 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.240592 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.240602 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.240613 | orchestrator | 2025-09-23 19:15:53.240676 | orchestrator | TASK [k3s_server : Stop k3s-init] ********************************************** 2025-09-23 19:15:53.240689 | orchestrator | Tuesday 23 September 2025 19:12:42 +0000 (0:00:01.439) 0:00:27.722 ***** 2025-09-23 19:15:53.240700 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.240711 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.240722 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.240732 | orchestrator | 2025-09-23 19:15:53.240743 | orchestrator | TASK [k3s_server : Stop k3s] *************************************************** 2025-09-23 19:15:53.240754 | orchestrator | Tuesday 23 September 2025 19:12:43 +0000 (0:00:01.392) 0:00:29.114 ***** 
2025-09-23 19:15:53.240765 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.240776 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.240786 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.240797 | orchestrator | 2025-09-23 19:15:53.240808 | orchestrator | TASK [k3s_server : Clean previous runs of k3s-init] **************************** 2025-09-23 19:15:53.240818 | orchestrator | Tuesday 23 September 2025 19:12:44 +0000 (0:00:01.037) 0:00:30.152 ***** 2025-09-23 19:15:53.240829 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.240840 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.240851 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.240861 | orchestrator | 2025-09-23 19:15:53.240872 | orchestrator | TASK [k3s_server : Deploy K3s http_proxy conf] ********************************* 2025-09-23 19:15:53.240883 | orchestrator | Tuesday 23 September 2025 19:12:45 +0000 (0:00:01.358) 0:00:31.512 ***** 2025-09-23 19:15:53.240893 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.240904 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.240915 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.240926 | orchestrator | 2025-09-23 19:15:53.240937 | orchestrator | TASK [k3s_server : Create /etc/rancher/k3s directory] ************************** 2025-09-23 19:15:53.240947 | orchestrator | Tuesday 23 September 2025 19:12:46 +0000 (0:00:00.478) 0:00:31.991 ***** 2025-09-23 19:15:53.240965 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.240975 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.240986 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.240997 | orchestrator | 2025-09-23 19:15:53.241008 | orchestrator | TASK [k3s_server : Create custom resolv.conf for k3s] ************************** 2025-09-23 19:15:53.241019 | orchestrator | Tuesday 23 September 2025 19:12:47 +0000 (0:00:01.012) 0:00:33.003 ***** 2025-09-23 19:15:53.241029 | orchestrator | changed: 
[testbed-node-1] 2025-09-23 19:15:53.241040 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.241051 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.241062 | orchestrator | 2025-09-23 19:15:53.241073 | orchestrator | TASK [k3s_server : Deploy vip manifest] **************************************** 2025-09-23 19:15:53.241083 | orchestrator | Tuesday 23 September 2025 19:12:48 +0000 (0:00:01.530) 0:00:34.534 ***** 2025-09-23 19:15:53.241094 | orchestrator | included: /ansible/roles/k3s_server/tasks/vip.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:15:53.241105 | orchestrator | 2025-09-23 19:15:53.241116 | orchestrator | TASK [k3s_server : Set _kube_vip_bgp_peers fact] ******************************* 2025-09-23 19:15:53.241127 | orchestrator | Tuesday 23 September 2025 19:12:49 +0000 (0:00:00.675) 0:00:35.210 ***** 2025-09-23 19:15:53.241137 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.241148 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.241159 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.241170 | orchestrator | 2025-09-23 19:15:53.241180 | orchestrator | TASK [k3s_server : Create manifests directory on first master] ***************** 2025-09-23 19:15:53.241191 | orchestrator | Tuesday 23 September 2025 19:12:52 +0000 (0:00:02.744) 0:00:37.954 ***** 2025-09-23 19:15:53.241200 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.241210 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.241219 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.241229 | orchestrator | 2025-09-23 19:15:53.241239 | orchestrator | TASK [k3s_server : Download vip rbac manifest to first master] ***************** 2025-09-23 19:15:53.241248 | orchestrator | Tuesday 23 September 2025 19:12:53 +0000 (0:00:00.725) 0:00:38.679 ***** 2025-09-23 19:15:53.241258 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.241267 | orchestrator | skipping: [testbed-node-2] 
2025-09-23 19:15:53.241277 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.241286 | orchestrator | 2025-09-23 19:15:53.241296 | orchestrator | TASK [k3s_server : Copy vip manifest to first master] ************************** 2025-09-23 19:15:53.241305 | orchestrator | Tuesday 23 September 2025 19:12:54 +0000 (0:00:01.074) 0:00:39.754 ***** 2025-09-23 19:15:53.241315 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.241324 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.241334 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.241343 | orchestrator | 2025-09-23 19:15:53.241353 | orchestrator | TASK [k3s_server : Deploy metallb manifest] ************************************ 2025-09-23 19:15:53.241362 | orchestrator | Tuesday 23 September 2025 19:12:55 +0000 (0:00:01.633) 0:00:41.387 ***** 2025-09-23 19:15:53.241372 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.241381 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.241391 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.241400 | orchestrator | 2025-09-23 19:15:53.241410 | orchestrator | TASK [k3s_server : Deploy kube-vip manifest] *********************************** 2025-09-23 19:15:53.241419 | orchestrator | Tuesday 23 September 2025 19:12:56 +0000 (0:00:00.753) 0:00:42.141 ***** 2025-09-23 19:15:53.241429 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.241438 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.241448 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.241457 | orchestrator | 2025-09-23 19:15:53.241467 | orchestrator | TASK [k3s_server : Init cluster inside the transient k3s-init service] ********* 2025-09-23 19:15:53.241476 | orchestrator | Tuesday 23 September 2025 19:12:57 +0000 (0:00:00.634) 0:00:42.776 ***** 2025-09-23 19:15:53.241486 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.241495 | orchestrator | changed: [testbed-node-2] 
2025-09-23 19:15:53.241509 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.241519 | orchestrator | 2025-09-23 19:15:53.242117 | orchestrator | TASK [k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails)] *** 2025-09-23 19:15:53.242138 | orchestrator | Tuesday 23 September 2025 19:12:59 +0000 (0:00:02.193) 0:00:44.970 ***** 2025-09-23 19:15:53.242148 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2025-09-23 19:15:53.242159 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2025-09-23 19:15:53.242169 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2025-09-23 19:15:53.242179 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2025-09-23 19:15:53.242189 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2025-09-23 19:15:53.242198 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2025-09-23 19:15:53.242208 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 2025-09-23 19:15:53.242218 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 2025-09-23 19:15:53.242227 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 
2025-09-23 19:15:53.242237 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 2025-09-23 19:15:53.242246 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 2025-09-23 19:15:53.242256 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 2025-09-23 19:15:53.242266 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (16 retries left). 2025-09-23 19:15:53.242276 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.242285 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.242295 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.242305 | orchestrator | 2025-09-23 19:15:53.242315 | orchestrator | TASK [k3s_server : Save logs of k3s-init.service] ****************************** 2025-09-23 19:15:53.242325 | orchestrator | Tuesday 23 September 2025 19:13:54 +0000 (0:00:55.091) 0:01:40.061 ***** 2025-09-23 19:15:53.242334 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.242344 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.242353 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.242363 | orchestrator | 2025-09-23 19:15:53.242372 | orchestrator | TASK [k3s_server : Kill the temporary service used for initialization] ********* 2025-09-23 19:15:53.242382 | orchestrator | Tuesday 23 September 2025 19:13:54 +0000 (0:00:00.293) 0:01:40.354 ***** 2025-09-23 19:15:53.242391 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.242401 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.242411 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.242420 | orchestrator | 2025-09-23 19:15:53.242430 | orchestrator | TASK 
[k3s_server : Copy K3s service file] ************************************** 2025-09-23 19:15:53.242439 | orchestrator | Tuesday 23 September 2025 19:13:55 +0000 (0:00:00.870) 0:01:41.225 ***** 2025-09-23 19:15:53.242449 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.242459 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.242468 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.242485 | orchestrator | 2025-09-23 19:15:53.242494 | orchestrator | TASK [k3s_server : Enable and check K3s service] ******************************* 2025-09-23 19:15:53.242504 | orchestrator | Tuesday 23 September 2025 19:13:56 +0000 (0:00:01.144) 0:01:42.369 ***** 2025-09-23 19:15:53.242514 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:15:53.242523 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:15:53.242533 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:15:53.242542 | orchestrator | 2025-09-23 19:15:53.242552 | orchestrator | TASK [k3s_server : Wait for node-token] **************************************** 2025-09-23 19:15:53.242562 | orchestrator | Tuesday 23 September 2025 19:14:21 +0000 (0:00:24.775) 0:02:07.145 ***** 2025-09-23 19:15:53.242571 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.242581 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.242590 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.242600 | orchestrator | 2025-09-23 19:15:53.242610 | orchestrator | TASK [k3s_server : Register node-token file access mode] *********************** 2025-09-23 19:15:53.242637 | orchestrator | Tuesday 23 September 2025 19:14:22 +0000 (0:00:00.646) 0:02:07.792 ***** 2025-09-23 19:15:53.242647 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:15:53.242657 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:15:53.242667 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:15:53.242676 | orchestrator | 2025-09-23 19:15:53.242686 | orchestrator | TASK [k3s_server : Change file access node-token] 
******************************
2025-09-23 19:15:53.242695 | orchestrator | Tuesday 23 September 2025 19:14:22 +0000 (0:00:00.615) 0:02:08.407 *****
2025-09-23 19:15:53.242705 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:53.242715 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:53.242724 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:53.242734 | orchestrator |
2025-09-23 19:15:53.242751 | orchestrator | TASK [k3s_server : Read node-token from master] ********************************
2025-09-23 19:15:53.242761 | orchestrator | Tuesday 23 September 2025 19:14:23 +0000 (0:00:00.608) 0:02:09.016 *****
2025-09-23 19:15:53.242771 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:53.242781 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:53.242790 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:53.242800 | orchestrator |
2025-09-23 19:15:53.242810 | orchestrator | TASK [k3s_server : Store Master node-token] ************************************
2025-09-23 19:15:53.242819 | orchestrator | Tuesday 23 September 2025 19:14:24 +0000 (0:00:00.876) 0:02:09.892 *****
2025-09-23 19:15:53.242829 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:53.242838 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:53.242848 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:53.242857 | orchestrator |
2025-09-23 19:15:53.242867 | orchestrator | TASK [k3s_server : Restore node-token file access] *****************************
2025-09-23 19:15:53.242877 | orchestrator | Tuesday 23 September 2025 19:14:24 +0000 (0:00:00.298) 0:02:10.190 *****
2025-09-23 19:15:53.242887 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:53.242896 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:53.242906 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:53.242915 | orchestrator |
2025-09-23 19:15:53.242925 | orchestrator | TASK [k3s_server : Create directory .kube] *************************************
2025-09-23 19:15:53.242935 | orchestrator | Tuesday 23 September 2025 19:14:25 +0000 (0:00:00.687) 0:02:10.878 *****
2025-09-23 19:15:53.242944 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:53.242954 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:53.242963 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:53.242973 | orchestrator |
2025-09-23 19:15:53.242982 | orchestrator | TASK [k3s_server : Copy config file to user home directory] ********************
2025-09-23 19:15:53.242992 | orchestrator | Tuesday 23 September 2025 19:14:25 +0000 (0:00:00.729) 0:02:11.608 *****
2025-09-23 19:15:53.243001 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:53.243011 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:53.243021 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:53.243030 | orchestrator |
2025-09-23 19:15:53.243040 | orchestrator | TASK [k3s_server : Configure kubectl cluster to https://192.168.16.8:6443] *****
2025-09-23 19:15:53.243058 | orchestrator | Tuesday 23 September 2025 19:14:27 +0000 (0:00:01.238) 0:02:12.847 *****
2025-09-23 19:15:53.243068 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:15:53.243078 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:15:53.243087 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:15:53.243097 | orchestrator |
2025-09-23 19:15:53.243106 | orchestrator | TASK [k3s_server : Create kubectl symlink] *************************************
2025-09-23 19:15:53.243116 | orchestrator | Tuesday 23 September 2025 19:14:28 +0000 (0:00:01.008) 0:02:13.855 *****
2025-09-23 19:15:53.243125 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.243135 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:15:53.243144 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:15:53.243154 | orchestrator |
2025-09-23 19:15:53.243163 | orchestrator | TASK [k3s_server : Create crictl symlink] **************************************
2025-09-23 19:15:53.243173 | orchestrator | Tuesday 23 September 2025 19:14:28 +0000 (0:00:00.260) 0:02:14.115 *****
2025-09-23 19:15:53.243183 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.243192 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:15:53.243202 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:15:53.243211 | orchestrator |
2025-09-23 19:15:53.243221 | orchestrator | TASK [k3s_server : Get contents of manifests folder] ***************************
2025-09-23 19:15:53.243231 | orchestrator | Tuesday 23 September 2025 19:14:28 +0000 (0:00:00.256) 0:02:14.372 *****
2025-09-23 19:15:53.243240 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:53.243250 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:53.243259 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:53.243269 | orchestrator |
2025-09-23 19:15:53.243278 | orchestrator | TASK [k3s_server : Get sub dirs of manifests folder] ***************************
2025-09-23 19:15:53.243288 | orchestrator | Tuesday 23 September 2025 19:14:29 +0000 (0:00:00.888) 0:02:15.261 *****
2025-09-23 19:15:53.243298 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:53.243307 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:53.243317 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:53.243326 | orchestrator |
2025-09-23 19:15:53.243336 | orchestrator | TASK [k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start] ***
2025-09-23 19:15:53.243345 | orchestrator | Tuesday 23 September 2025 19:14:30 +0000 (0:00:00.650) 0:02:15.911 *****
2025-09-23 19:15:53.243355 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml)
2025-09-23 19:15:53.243365 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml)
2025-09-23 19:15:53.243375 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml)
2025-09-23 19:15:53.243384 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml)
2025-09-23 19:15:53.243394 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml)
2025-09-23 19:15:53.243404 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml)
2025-09-23 19:15:53.243413 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml)
2025-09-23 19:15:53.243426 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml)
2025-09-23 19:15:53.243436 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml)
2025-09-23 19:15:53.243446 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip.yaml)
2025-09-23 19:15:53.243456 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml)
2025-09-23 19:15:53.243465 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml)
2025-09-23 19:15:53.243479 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip-rbac.yaml)
2025-09-23 19:15:53.243494 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml)
2025-09-23 19:15:53.243504 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml)
2025-09-23 19:15:53.243513 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml)
2025-09-23 19:15:53.243523 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server)
2025-09-23 19:15:53.243532 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server)
2025-09-23 19:15:53.243542 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml)
2025-09-23 19:15:53.243552 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server)
2025-09-23 19:15:53.243561 | orchestrator |
2025-09-23 19:15:53.243571 | orchestrator | PLAY [Deploy k3s worker nodes] *************************************************
2025-09-23 19:15:53.243580 | orchestrator |
2025-09-23 19:15:53.243590 | orchestrator | TASK [k3s_agent : Validating arguments against arg spec 'main' - Setup k3s agents] ***
2025-09-23 19:15:53.243599 | orchestrator | Tuesday 23 September 2025 19:14:33 +0000 (0:00:03.054) 0:02:18.966 *****
2025-09-23 19:15:53.243609 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:15:53.243619 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:15:53.243671 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:15:53.243681 | orchestrator |
2025-09-23 19:15:53.243691 | orchestrator | TASK [k3s_agent : Check if system is PXE-booted] *******************************
2025-09-23 19:15:53.243700 | orchestrator | Tuesday 23 September 2025 19:14:33 +0000 (0:00:00.423) 0:02:19.389 *****
2025-09-23 19:15:53.243710 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:15:53.243719 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:15:53.243729 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:15:53.243738 | orchestrator |
2025-09-23 19:15:53.243748 | orchestrator | TASK [k3s_agent : Set fact for PXE-booted system] ******************************
2025-09-23 19:15:53.243758 | orchestrator | Tuesday 23 September 2025 19:14:34 +0000 (0:00:00.598) 0:02:19.988 *****
2025-09-23 19:15:53.243767 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:15:53.243776 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:15:53.243784 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:15:53.243792 | orchestrator |
2025-09-23 19:15:53.243800 | orchestrator | TASK [k3s_agent : Include http_proxy configuration tasks] **********************
2025-09-23 19:15:53.243807 | orchestrator | Tuesday 23 September 2025 19:14:34 +0000 (0:00:00.374) 0:02:20.362 *****
2025-09-23 19:15:53.243815 | orchestrator | included: /ansible/roles/k3s_agent/tasks/http_proxy.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:15:53.243823 | orchestrator |
2025-09-23 19:15:53.243831 | orchestrator | TASK [k3s_agent : Create k3s-node.service.d directory] *************************
2025-09-23 19:15:53.243839 | orchestrator | Tuesday 23 September 2025 19:14:35 +0000 (0:00:00.570) 0:02:20.933 *****
2025-09-23 19:15:53.243847 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:15:53.243855 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:15:53.243863 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:15:53.243871 | orchestrator |
2025-09-23 19:15:53.243878 | orchestrator | TASK [k3s_agent : Copy K3s http_proxy conf file] *******************************
2025-09-23 19:15:53.243886 | orchestrator | Tuesday 23 September 2025 19:14:35 +0000 (0:00:00.359) 0:02:21.292 *****
2025-09-23 19:15:53.243894 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:15:53.243902 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:15:53.243910 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:15:53.243918 | orchestrator |
2025-09-23 19:15:53.243925 | orchestrator | TASK [k3s_agent : Deploy K3s http_proxy conf] **********************************
2025-09-23 19:15:53.243933 | orchestrator | Tuesday 23 September 2025 19:14:36 +0000 (0:00:00.348) 0:02:21.641 *****
2025-09-23 19:15:53.243941 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:15:53.243949 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:15:53.243957 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:15:53.243970 | orchestrator |
2025-09-23 19:15:53.243978 | orchestrator | TASK [k3s_agent : Create /etc/rancher/k3s directory] ***************************
2025-09-23 19:15:53.243986 | orchestrator | Tuesday 23 September 2025 19:14:36 +0000 (0:00:00.290) 0:02:21.931 *****
2025-09-23 19:15:53.243993 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:15:53.244001 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:15:53.244009 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:15:53.244017 | orchestrator |
2025-09-23 19:15:53.244025 | orchestrator | TASK [k3s_agent : Create custom resolv.conf for k3s] ***************************
2025-09-23 19:15:53.244033 | orchestrator | Tuesday 23 September 2025 19:14:37 +0000 (0:00:00.835) 0:02:22.767 *****
2025-09-23 19:15:53.244040 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:15:53.244048 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:15:53.244056 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:15:53.244064 | orchestrator |
2025-09-23 19:15:53.244072 | orchestrator | TASK [k3s_agent : Configure the k3s service] ***********************************
2025-09-23 19:15:53.244079 | orchestrator | Tuesday 23 September 2025 19:14:38 +0000 (0:00:01.209) 0:02:23.976 *****
2025-09-23 19:15:53.244087 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:15:53.244095 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:15:53.244103 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:15:53.244111 | orchestrator |
2025-09-23 19:15:53.244122 | orchestrator | TASK [k3s_agent : Manage k3s service] ******************************************
2025-09-23 19:15:53.244130 | orchestrator | Tuesday 23 September 2025 19:14:39 +0000 (0:00:01.372) 0:02:25.348 *****
2025-09-23 19:15:53.244138 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:15:53.244146 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:15:53.244154 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:15:53.244162 | orchestrator |
2025-09-23 19:15:53.244169 | orchestrator | PLAY [Prepare kubeconfig file] *************************************************
2025-09-23 19:15:53.244177 | orchestrator |
2025-09-23 19:15:53.244185 | orchestrator | TASK [Get home directory of operator user] *************************************
2025-09-23 19:15:53.244193 | orchestrator | Tuesday 23 September 2025 19:14:51 +0000 (0:00:12.249) 0:02:37.598 *****
2025-09-23 19:15:53.244201 | orchestrator | ok: [testbed-manager]
2025-09-23 19:15:53.244209 | orchestrator |
2025-09-23 19:15:53.244221 | orchestrator | TASK [Create .kube directory] **************************************************
2025-09-23 19:15:53.244229 | orchestrator | Tuesday 23 September 2025 19:14:52 +0000 (0:00:00.736) 0:02:38.334 *****
2025-09-23 19:15:53.244237 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.244245 | orchestrator |
2025-09-23 19:15:53.244253 | orchestrator | TASK [Get kubeconfig file] *****************************************************
2025-09-23 19:15:53.244261 | orchestrator | Tuesday 23 September 2025 19:14:53 +0000 (0:00:00.452) 0:02:38.787 *****
2025-09-23 19:15:53.244269 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)]
2025-09-23 19:15:53.244276 | orchestrator |
2025-09-23 19:15:53.244284 | orchestrator | TASK [Write kubeconfig file] ***************************************************
2025-09-23 19:15:53.244292 | orchestrator | Tuesday 23 September 2025 19:14:53 +0000 (0:00:00.548) 0:02:39.336 *****
2025-09-23 19:15:53.244300 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.244308 | orchestrator |
2025-09-23 19:15:53.244316 | orchestrator | TASK [Change server address in the kubeconfig] *********************************
2025-09-23 19:15:53.244324 | orchestrator | Tuesday 23 September 2025 19:14:54 +0000 (0:00:00.837) 0:02:40.174 *****
2025-09-23 19:15:53.244331 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.244339 | orchestrator |
2025-09-23 19:15:53.244347 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************
2025-09-23 19:15:53.244355 | orchestrator | Tuesday 23 September 2025 19:14:55 +0000 (0:00:00.569) 0:02:40.743 *****
2025-09-23 19:15:53.244362 | orchestrator | changed: [testbed-manager -> localhost]
2025-09-23 19:15:53.244370 | orchestrator |
2025-09-23 19:15:53.244378 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ******
2025-09-23 19:15:53.244386 | orchestrator | Tuesday 23 September 2025 19:14:56 +0000 (0:00:01.669) 0:02:42.413 *****
2025-09-23 19:15:53.244398 | orchestrator | changed: [testbed-manager -> localhost]
2025-09-23 19:15:53.244406 | orchestrator |
2025-09-23 19:15:53.244414 | orchestrator | TASK [Set KUBECONFIG environment variable] *************************************
2025-09-23 19:15:53.244422 | orchestrator | Tuesday 23 September 2025 19:14:57 +0000 (0:00:00.826) 0:02:43.240 *****
2025-09-23 19:15:53.244430 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.244438 | orchestrator |
2025-09-23 19:15:53.244445 | orchestrator | TASK [Enable kubectl command line completion] **********************************
2025-09-23 19:15:53.244453 | orchestrator | Tuesday 23 September 2025 19:14:58 +0000 (0:00:00.397) 0:02:43.638 *****
2025-09-23 19:15:53.244461 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.244469 | orchestrator |
2025-09-23 19:15:53.244477 | orchestrator | PLAY [Apply role kubectl] ******************************************************
2025-09-23 19:15:53.244485 | orchestrator |
2025-09-23 19:15:53.244492 | orchestrator | TASK [kubectl : Gather variables for each operating system] ********************
2025-09-23 19:15:53.244500 | orchestrator | Tuesday 23 September 2025 19:14:58 +0000 (0:00:00.545) 0:02:44.183 *****
2025-09-23 19:15:53.244508 | orchestrator | ok: [testbed-manager]
2025-09-23 19:15:53.244516 | orchestrator |
2025-09-23 19:15:53.244524 | orchestrator | TASK [kubectl : Include distribution specific install tasks] *******************
2025-09-23 19:15:53.244532 | orchestrator | Tuesday 23 September 2025 19:14:58 +0000 (0:00:00.134) 0:02:44.318 *****
2025-09-23 19:15:53.244540 | orchestrator | included: /ansible/roles/kubectl/tasks/install-Debian-family.yml for testbed-manager
2025-09-23 19:15:53.244547 | orchestrator |
2025-09-23 19:15:53.244555 | orchestrator | TASK [kubectl : Remove old architecture-dependent repository] ******************
2025-09-23 19:15:53.244563 | orchestrator | Tuesday 23 September 2025 19:14:58 +0000 (0:00:00.187) 0:02:44.506 *****
2025-09-23 19:15:53.244571 | orchestrator | ok: [testbed-manager]
2025-09-23 19:15:53.244579 | orchestrator |
2025-09-23 19:15:53.244587 | orchestrator | TASK [kubectl : Install apt-transport-https package] ***************************
2025-09-23 19:15:53.244594 | orchestrator | Tuesday 23 September 2025 19:14:59 +0000 (0:00:00.638) 0:02:45.144 *****
2025-09-23 19:15:53.244602 | orchestrator | ok: [testbed-manager]
2025-09-23 19:15:53.244610 | orchestrator |
2025-09-23 19:15:53.244618 | orchestrator | TASK [kubectl : Add repository gpg key] ****************************************
2025-09-23 19:15:53.244637 | orchestrator | Tuesday 23 September 2025 19:15:00 +0000 (0:00:01.329) 0:02:46.474 *****
2025-09-23 19:15:53.244644 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.244652 | orchestrator |
2025-09-23 19:15:53.244660 | orchestrator | TASK [kubectl : Set permissions of gpg key] ************************************
2025-09-23 19:15:53.244668 | orchestrator | Tuesday 23 September 2025 19:15:01 +0000 (0:00:00.756) 0:02:47.231 *****
2025-09-23 19:15:53.244676 | orchestrator | ok: [testbed-manager]
2025-09-23 19:15:53.244684 | orchestrator |
2025-09-23 19:15:53.244691 | orchestrator | TASK [kubectl : Add repository Debian] *****************************************
2025-09-23 19:15:53.244699 | orchestrator | Tuesday 23 September 2025 19:15:02 +0000 (0:00:00.527) 0:02:47.758 *****
2025-09-23 19:15:53.244707 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.244715 | orchestrator |
2025-09-23 19:15:53.244723 | orchestrator | TASK [kubectl : Install required packages] *************************************
2025-09-23 19:15:53.244731 | orchestrator | Tuesday 23 September 2025 19:15:09 +0000 (0:00:07.345) 0:02:55.103 *****
2025-09-23 19:15:53.244738 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.244746 | orchestrator |
2025-09-23 19:15:53.244754 | orchestrator | TASK [kubectl : Remove kubectl symlink] ****************************************
2025-09-23 19:15:53.244762 | orchestrator | Tuesday 23 September 2025 19:15:21 +0000 (0:00:12.440) 0:03:07.544 *****
2025-09-23 19:15:53.244770 | orchestrator | ok: [testbed-manager]
2025-09-23 19:15:53.244778 | orchestrator |
2025-09-23 19:15:53.244789 | orchestrator | PLAY [Run post actions on master nodes] ****************************************
2025-09-23 19:15:53.244797 | orchestrator |
2025-09-23 19:15:53.244805 | orchestrator | TASK [k3s_server_post : Validating arguments against arg spec 'main' - Configure k3s cluster] ***
2025-09-23 19:15:53.244813 | orchestrator | Tuesday 23 September 2025 19:15:22 +0000 (0:00:00.483) 0:03:08.027 *****
2025-09-23 19:15:53.244825 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:53.244833 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:53.244840 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:53.244848 | orchestrator |
2025-09-23 19:15:53.244856 | orchestrator | TASK [k3s_server_post : Deploy calico] *****************************************
2025-09-23 19:15:53.244864 | orchestrator | Tuesday 23 September 2025 19:15:22 +0000 (0:00:00.351) 0:03:08.379 *****
2025-09-23 19:15:53.244876 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:15:53.244884 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.244892 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:15:53.244900 | orchestrator |
2025-09-23 19:15:53.244908 | orchestrator | TASK [k3s_server_post : Deploy cilium] *****************************************
2025-09-23 19:15:53.244916 | orchestrator | Tuesday 23 September 2025 19:15:23 +0000 (0:00:00.461) 0:03:08.841 *****
2025-09-23 19:15:53.244924 | orchestrator | included: /ansible/roles/k3s_server_post/tasks/cilium.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:15:53.244932 | orchestrator |
2025-09-23 19:15:53.244939 | orchestrator | TASK [k3s_server_post : Create tmp directory on first master] ******************
2025-09-23 19:15:53.244947 | orchestrator | Tuesday 23 September 2025 19:15:23 +0000 (0:00:00.699) 0:03:09.540 *****
2025-09-23 19:15:53.244955 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.244963 | orchestrator |
2025-09-23 19:15:53.244971 | orchestrator | TASK [k3s_server_post : Check if Cilium CLI is installed] **********************
2025-09-23 19:15:53.244979 | orchestrator | Tuesday 23 September 2025 19:15:24 +0000 (0:00:00.177) 0:03:09.718 *****
2025-09-23 19:15:53.244986 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.244994 | orchestrator |
2025-09-23 19:15:53.245002 | orchestrator | TASK [k3s_server_post : Check for Cilium CLI version in command output] ********
2025-09-23 19:15:53.245010 | orchestrator | Tuesday 23 September 2025 19:15:24 +0000 (0:00:00.203) 0:03:09.922 *****
2025-09-23 19:15:53.245018 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245026 | orchestrator |
2025-09-23 19:15:53.245033 | orchestrator | TASK [k3s_server_post : Get latest stable Cilium CLI version file] *************
2025-09-23 19:15:53.245041 | orchestrator | Tuesday 23 September 2025 19:15:24 +0000 (0:00:00.199) 0:03:10.122 *****
2025-09-23 19:15:53.245049 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245057 | orchestrator |
2025-09-23 19:15:53.245065 | orchestrator | TASK [k3s_server_post : Read Cilium CLI stable version from file] **************
2025-09-23 19:15:53.245073 | orchestrator | Tuesday 23 September 2025 19:15:24 +0000 (0:00:00.201) 0:03:10.323 *****
2025-09-23 19:15:53.245080 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245088 | orchestrator |
2025-09-23 19:15:53.245096 | orchestrator | TASK [k3s_server_post : Log installed Cilium CLI version] **********************
2025-09-23 19:15:53.245104 | orchestrator | Tuesday 23 September 2025 19:15:24 +0000 (0:00:00.177) 0:03:10.501 *****
2025-09-23 19:15:53.245112 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245120 | orchestrator |
2025-09-23 19:15:53.245127 | orchestrator | TASK [k3s_server_post : Log latest stable Cilium CLI version] ******************
2025-09-23 19:15:53.245135 | orchestrator | Tuesday 23 September 2025 19:15:25 +0000 (0:00:00.175) 0:03:10.677 *****
2025-09-23 19:15:53.245143 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245151 | orchestrator |
2025-09-23 19:15:53.245159 | orchestrator | TASK [k3s_server_post : Determine if Cilium CLI needs installation or update] ***
2025-09-23 19:15:53.245166 | orchestrator | Tuesday 23 September 2025 19:15:25 +0000 (0:00:00.175) 0:03:10.852 *****
2025-09-23 19:15:53.245174 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245182 | orchestrator |
2025-09-23 19:15:53.245190 | orchestrator | TASK [k3s_server_post : Set architecture variable] *****************************
2025-09-23 19:15:53.245198 | orchestrator | Tuesday 23 September 2025 19:15:25 +0000 (0:00:00.205) 0:03:11.058 *****
2025-09-23 19:15:53.245206 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245213 | orchestrator |
2025-09-23 19:15:53.245221 | orchestrator | TASK [k3s_server_post : Download Cilium CLI and checksum] **********************
2025-09-23 19:15:53.245234 | orchestrator | Tuesday 23 September 2025 19:15:25 +0000 (0:00:00.497) 0:03:11.556 *****
2025-09-23 19:15:53.245242 | orchestrator | skipping: [testbed-node-0] => (item=.tar.gz)
2025-09-23 19:15:53.245250 | orchestrator | skipping: [testbed-node-0] => (item=.tar.gz.sha256sum)
2025-09-23 19:15:53.245258 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245266 | orchestrator |
2025-09-23 19:15:53.245274 | orchestrator | TASK [k3s_server_post : Verify the downloaded tarball] *************************
2025-09-23 19:15:53.245282 | orchestrator | Tuesday 23 September 2025 19:15:26 +0000 (0:00:00.287) 0:03:11.843 *****
2025-09-23 19:15:53.245289 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245297 | orchestrator |
2025-09-23 19:15:53.245305 | orchestrator | TASK [k3s_server_post : Extract Cilium CLI to /usr/local/bin] ******************
2025-09-23 19:15:53.245313 | orchestrator | Tuesday 23 September 2025 19:15:26 +0000 (0:00:00.246) 0:03:12.089 *****
2025-09-23 19:15:53.245321 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245329 | orchestrator |
2025-09-23 19:15:53.245337 | orchestrator | TASK [k3s_server_post : Remove downloaded tarball and checksum file] ***********
2025-09-23 19:15:53.245345 | orchestrator | Tuesday 23 September 2025 19:15:26 +0000 (0:00:00.183) 0:03:12.273 *****
2025-09-23 19:15:53.245352 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245360 | orchestrator |
2025-09-23 19:15:53.245368 | orchestrator | TASK [k3s_server_post : Wait for connectivity to kube VIP] *********************
2025-09-23 19:15:53.245376 | orchestrator | Tuesday 23 September 2025 19:15:26 +0000 (0:00:00.216) 0:03:12.452 *****
2025-09-23 19:15:53.245384 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245391 | orchestrator |
2025-09-23 19:15:53.245399 | orchestrator | TASK [k3s_server_post : Fail if kube VIP not reachable] ************************
2025-09-23 19:15:53.245407 | orchestrator | Tuesday 23 September 2025 19:15:27 +0000 (0:00:00.216) 0:03:12.669 *****
2025-09-23 19:15:53.245415 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245423 | orchestrator |
2025-09-23 19:15:53.245434 | orchestrator | TASK [k3s_server_post : Test for existing Cilium install] **********************
2025-09-23 19:15:53.245442 | orchestrator | Tuesday 23 September 2025 19:15:27 +0000 (0:00:00.198) 0:03:12.867 *****
2025-09-23 19:15:53.245450 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245457 | orchestrator |
2025-09-23 19:15:53.245465 | orchestrator | TASK [k3s_server_post : Check Cilium version] **********************************
2025-09-23 19:15:53.245473 | orchestrator | Tuesday 23 September 2025 19:15:27 +0000 (0:00:00.169) 0:03:13.036 *****
2025-09-23 19:15:53.245481 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245489 | orchestrator |
2025-09-23 19:15:53.245496 | orchestrator | TASK [k3s_server_post : Parse installed Cilium version] ************************
2025-09-23 19:15:53.245504 | orchestrator | Tuesday 23 September 2025 19:15:27 +0000 (0:00:00.194) 0:03:13.231 *****
2025-09-23 19:15:53.245515 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245523 | orchestrator |
2025-09-23 19:15:53.245531 | orchestrator | TASK [k3s_server_post : Determine if Cilium needs update] **********************
2025-09-23 19:15:53.245539 | orchestrator | Tuesday 23 September 2025 19:15:27 +0000 (0:00:00.185) 0:03:13.417 *****
2025-09-23 19:15:53.245547 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245555 | orchestrator |
2025-09-23 19:15:53.245563 | orchestrator | TASK [k3s_server_post : Log result] ********************************************
2025-09-23 19:15:53.245571 | orchestrator | Tuesday 23 September 2025 19:15:28 +0000 (0:00:00.200) 0:03:13.618 *****
2025-09-23 19:15:53.245578 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245586 | orchestrator |
2025-09-23 19:15:53.245594 | orchestrator | TASK [k3s_server_post : Install Cilium] ****************************************
2025-09-23 19:15:53.245602 | orchestrator | Tuesday 23 September 2025 19:15:28 +0000 (0:00:00.219) 0:03:13.837 *****
2025-09-23 19:15:53.245610 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245617 | orchestrator |
2025-09-23 19:15:53.245636 | orchestrator | TASK [k3s_server_post : Wait for Cilium resources] *****************************
2025-09-23 19:15:53.245644 | orchestrator | Tuesday 23 September 2025 19:15:28 +0000 (0:00:00.207) 0:03:14.045 *****
2025-09-23 19:15:53.245657 | orchestrator | skipping: [testbed-node-0] => (item=deployment/cilium-operator)
2025-09-23 19:15:53.245665 | orchestrator | skipping: [testbed-node-0] => (item=daemonset/cilium)
2025-09-23 19:15:53.245673 | orchestrator | skipping: [testbed-node-0] => (item=deployment/hubble-relay)
2025-09-23 19:15:53.245680 | orchestrator | skipping: [testbed-node-0] => (item=deployment/hubble-ui)
2025-09-23 19:15:53.245688 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245696 | orchestrator |
2025-09-23 19:15:53.245704 | orchestrator | TASK [k3s_server_post : Set _cilium_bgp_neighbors fact] ************************
2025-09-23 19:15:53.245712 | orchestrator | Tuesday 23 September 2025 19:15:29 +0000 (0:00:01.098) 0:03:15.144 *****
2025-09-23 19:15:53.245720 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245727 | orchestrator |
2025-09-23 19:15:53.245735 | orchestrator | TASK [k3s_server_post : Copy BGP manifests to first master] ********************
2025-09-23 19:15:53.245743 | orchestrator | Tuesday 23 September 2025 19:15:29 +0000 (0:00:00.248) 0:03:15.392 *****
2025-09-23 19:15:53.245751 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245759 | orchestrator |
2025-09-23 19:15:53.245767 | orchestrator | TASK [k3s_server_post : Apply BGP manifests] ***********************************
2025-09-23 19:15:53.245774 | orchestrator | Tuesday 23 September 2025 19:15:30 +0000 (0:00:00.247) 0:03:15.639 *****
2025-09-23 19:15:53.245782 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245790 | orchestrator |
2025-09-23 19:15:53.245798 | orchestrator | TASK [k3s_server_post : Print error message if BGP manifests application fails] ***
2025-09-23 19:15:53.245806 | orchestrator | Tuesday 23 September 2025 19:15:30 +0000 (0:00:00.263) 0:03:15.902 *****
2025-09-23 19:15:53.245813 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245821 | orchestrator |
2025-09-23 19:15:53.245829 | orchestrator | TASK [k3s_server_post : Test for BGP config resources] *************************
2025-09-23 19:15:53.245837 | orchestrator | Tuesday 23 September 2025 19:15:30 +0000 (0:00:00.247) 0:03:16.150 *****
2025-09-23 19:15:53.245845 | orchestrator | skipping: [testbed-node-0] => (item=kubectl get CiliumBGPPeeringPolicy.cilium.io)
2025-09-23 19:15:53.245852 | orchestrator | skipping: [testbed-node-0] => (item=kubectl get CiliumLoadBalancerIPPool.cilium.io)
2025-09-23 19:15:53.245860 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245868 | orchestrator |
2025-09-23 19:15:53.245876 | orchestrator | TASK [k3s_server_post : Deploy metallb pool] ***********************************
2025-09-23 19:15:53.245884 | orchestrator | Tuesday 23 September 2025 19:15:30 +0000 (0:00:00.292) 0:03:16.442 *****
2025-09-23 19:15:53.245891 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:15:53.245899 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:15:53.245907 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:15:53.245915 | orchestrator |
2025-09-23 19:15:53.245923 | orchestrator | TASK [k3s_server_post : Remove tmp directory used for manifests] ***************
2025-09-23 19:15:53.245931 | orchestrator | Tuesday 23 September 2025 19:15:31 +0000 (0:00:00.310) 0:03:16.753 *****
2025-09-23 19:15:53.245939 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:53.245946 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:53.245954 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:53.245962 | orchestrator |
2025-09-23 19:15:53.245970 | orchestrator | PLAY [Apply role k9s] **********************************************************
2025-09-23 19:15:53.245978 | orchestrator |
2025-09-23 19:15:53.245986 | orchestrator | TASK [k9s : Gather variables for each operating system] ************************
2025-09-23 19:15:53.245993 | orchestrator | Tuesday 23 September 2025 19:15:32 +0000 (0:00:01.084) 0:03:17.837 *****
2025-09-23 19:15:53.246001 | orchestrator | ok: [testbed-manager]
2025-09-23 19:15:53.246009 | orchestrator |
2025-09-23 19:15:53.246036 | orchestrator | TASK [k9s : Include distribution specific install tasks] ***********************
2025-09-23 19:15:53.246045 | orchestrator | Tuesday 23 September 2025 19:15:32 +0000 (0:00:00.147) 0:03:17.985 *****
2025-09-23 19:15:53.246053 | orchestrator | included: /ansible/roles/k9s/tasks/install-Debian-family.yml for testbed-manager
2025-09-23 19:15:53.246061 | orchestrator |
2025-09-23 19:15:53.246069 | orchestrator | TASK [k9s : Install k9s packages] **********************************************
2025-09-23 19:15:53.246081 | orchestrator | Tuesday 23 September 2025 19:15:32 +0000 (0:00:00.240) 0:03:18.225 *****
2025-09-23 19:15:53.246089 | orchestrator | changed: [testbed-manager]
2025-09-23 19:15:53.246097 | orchestrator |
2025-09-23 19:15:53.246108 | orchestrator | PLAY [Manage labels, annotations, and taints on all k3s nodes] *****************
2025-09-23 19:15:53.246116 | orchestrator |
2025-09-23 19:15:53.246124 | orchestrator | TASK [Merge labels, annotations, and taints] ***********************************
2025-09-23 19:15:53.246132 | orchestrator | Tuesday 23 September 2025 19:15:38 +0000 (0:00:05.405) 0:03:23.631 *****
2025-09-23 19:15:53.246140 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:15:53.246148 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:15:53.246156 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:15:53.246164 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:15:53.246171 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:15:53.246179 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:15:53.246187 | orchestrator |
2025-09-23 19:15:53.246199 | orchestrator | TASK [Manage labels] ***********************************************************
2025-09-23 19:15:53.246208 | orchestrator | Tuesday 23 September 2025 19:15:39 +0000 (0:00:01.033) 0:03:24.664 *****
2025-09-23 19:15:53.246216 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/compute-plane=true)
2025-09-23 19:15:53.246224 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/compute-plane=true)
2025-09-23 19:15:53.246232 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/compute-plane=true)
2025-09-23 19:15:53.246240 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/control-plane=true)
2025-09-23 19:15:53.246247 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/control-plane=true)
2025-09-23 19:15:53.246255 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/control-plane=true)
2025-09-23 19:15:53.246263 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.kubernetes.io/worker=worker)
2025-09-23 19:15:53.246271 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.kubernetes.io/worker=worker)
2025-09-23 19:15:53.246278 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.kubernetes.io/worker=worker)
2025-09-23 19:15:53.246286 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=openstack-control-plane=enabled)
2025-09-23 19:15:53.246294 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=openstack-control-plane=enabled)
2025-09-23 19:15:53.246302 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/rook-osd=true)
2025-09-23 19:15:53.246309 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=openstack-control-plane=enabled)
2025-09-23 19:15:53.246317 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/rook-osd=true)
2025-09-23 19:15:53.246325 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/network-plane=true)
2025-09-23 19:15:53.246333 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/network-plane=true)
2025-09-23 19:15:53.246341 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/rook-osd=true)
2025-09-23 19:15:53.246349 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/network-plane=true)
2025-09-23 19:15:53.246357 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mds=true)
2025-09-23 19:15:53.246365 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mds=true)
2025-09-23 19:15:53.246372 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mds=true)
2025-09-23 19:15:53.246380 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mgr=true)
2025-09-23 19:15:53.246388 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mgr=true)
2025-09-23 19:15:53.246396 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mgr=true)
2025-09-23 19:15:53.246408 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mon=true)
2025-09-23 19:15:53.246416 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mon=true)
2025-09-23 19:15:53.246423 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mon=true)
2025-09-23 19:15:53.246431 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-rgw=true)
2025-09-23 19:15:53.246439 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-rgw=true)
2025-09-23 19:15:53.246447 | orchestrator | ok: [testbed-node-2
-> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2025-09-23 19:15:53.246455 | orchestrator | 2025-09-23 19:15:53.246463 | orchestrator | TASK [Manage annotations] ****************************************************** 2025-09-23 19:15:53.246471 | orchestrator | Tuesday 23 September 2025 19:15:50 +0000 (0:00:11.112) 0:03:35.777 ***** 2025-09-23 19:15:53.246478 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.246486 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.246494 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.246502 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.246510 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.246517 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.246525 | orchestrator | 2025-09-23 19:15:53.246533 | orchestrator | TASK [Manage taints] *********************************************************** 2025-09-23 19:15:53.246541 | orchestrator | Tuesday 23 September 2025 19:15:50 +0000 (0:00:00.670) 0:03:36.447 ***** 2025-09-23 19:15:53.246549 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:15:53.246557 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:15:53.246564 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:15:53.246572 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:15:53.246580 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:15:53.246591 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:15:53.246599 | orchestrator | 2025-09-23 19:15:53.246607 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:15:53.246615 | orchestrator | testbed-manager : ok=21  changed=11  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:15:53.246653 | orchestrator | testbed-node-0 : ok=42  changed=20  unreachable=0 failed=0 skipped=45  rescued=0 ignored=0 2025-09-23 19:15:53.246667 | orchestrator | testbed-node-1 : ok=39  changed=17 
 unreachable=0 failed=0 skipped=21  rescued=0 ignored=0 2025-09-23 19:15:53.246676 | orchestrator | testbed-node-2 : ok=39  changed=17  unreachable=0 failed=0 skipped=21  rescued=0 ignored=0 2025-09-23 19:15:53.246684 | orchestrator | testbed-node-3 : ok=19  changed=9  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0 2025-09-23 19:15:53.246692 | orchestrator | testbed-node-4 : ok=19  changed=9  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0 2025-09-23 19:15:53.246699 | orchestrator | testbed-node-5 : ok=19  changed=9  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0 2025-09-23 19:15:53.246707 | orchestrator | 2025-09-23 19:15:53.246715 | orchestrator | 2025-09-23 19:15:53.246723 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:15:53.246731 | orchestrator | Tuesday 23 September 2025 19:15:51 +0000 (0:00:00.441) 0:03:36.889 ***** 2025-09-23 19:15:53.246738 | orchestrator | =============================================================================== 2025-09-23 19:15:53.246746 | orchestrator | k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails) -- 55.09s 2025-09-23 19:15:53.246754 | orchestrator | k3s_server : Enable and check K3s service ------------------------------ 24.78s 2025-09-23 19:15:53.246769 | orchestrator | kubectl : Install required packages ------------------------------------ 12.44s 2025-09-23 19:15:53.246777 | orchestrator | k3s_agent : Manage k3s service ----------------------------------------- 12.25s 2025-09-23 19:15:53.246785 | orchestrator | Manage labels ---------------------------------------------------------- 11.11s 2025-09-23 19:15:53.246792 | orchestrator | kubectl : Add repository Debian ----------------------------------------- 7.35s 2025-09-23 19:15:53.246800 | orchestrator | k3s_download : Download k3s binary x64 ---------------------------------- 5.55s 2025-09-23 19:15:53.246808 | orchestrator | k9s : Install k9s 
packages ---------------------------------------------- 5.41s 2025-09-23 19:15:53.246816 | orchestrator | k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start --- 3.05s 2025-09-23 19:15:53.246823 | orchestrator | k3s_download : Download k3s binary armhf -------------------------------- 3.01s 2025-09-23 19:15:53.246831 | orchestrator | k3s_server : Set _kube_vip_bgp_peers fact ------------------------------- 2.74s 2025-09-23 19:15:53.246839 | orchestrator | k3s_custom_registries : Create directory /etc/rancher/k3s --------------- 2.50s 2025-09-23 19:15:53.246847 | orchestrator | k3s_server : Init cluster inside the transient k3s-init service --------- 2.19s 2025-09-23 19:15:53.246855 | orchestrator | k3s_download : Download k3s binary arm64 -------------------------------- 1.80s 2025-09-23 19:15:53.246862 | orchestrator | k3s_prereq : Enable IPv4 forwarding ------------------------------------- 1.80s 2025-09-23 19:15:53.246870 | orchestrator | Make kubeconfig available for use inside the manager service ------------ 1.67s 2025-09-23 19:15:53.246878 | orchestrator | k3s_custom_registries : Insert registries into /etc/rancher/k3s/registries.yaml --- 1.64s 2025-09-23 19:15:53.246885 | orchestrator | k3s_server : Copy vip manifest to first master -------------------------- 1.64s 2025-09-23 19:15:53.246893 | orchestrator | k3s_server : Create custom resolv.conf for k3s -------------------------- 1.53s 2025-09-23 19:15:53.246901 | orchestrator | k3s_server : Validating arguments against arg spec 'main' - Setup k3s servers --- 1.44s 2025-09-23 19:15:53.246909 | orchestrator | 2025-09-23 19:15:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:56.273956 | orchestrator | 2025-09-23 19:15:56 | INFO  | Task 87fad0cd-c392-4aeb-9be7-0792a9af9b74 is in state STARTED 2025-09-23 19:15:56.275339 | orchestrator | 2025-09-23 19:15:56 | INFO  | Task 840b5689-b177-42e5-841c-ef7d745de16e is in state 
STARTED 2025-09-23 19:15:56.276994 | orchestrator | 2025-09-23 19:15:56 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:56.277612 | orchestrator | 2025-09-23 19:15:56 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:56.277690 | orchestrator | 2025-09-23 19:15:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:15:59.311955 | orchestrator | 2025-09-23 19:15:59 | INFO  | Task 87fad0cd-c392-4aeb-9be7-0792a9af9b74 is in state STARTED 2025-09-23 19:15:59.312382 | orchestrator | 2025-09-23 19:15:59 | INFO  | Task 840b5689-b177-42e5-841c-ef7d745de16e is in state SUCCESS 2025-09-23 19:15:59.313400 | orchestrator | 2025-09-23 19:15:59 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:15:59.316218 | orchestrator | 2025-09-23 19:15:59 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:15:59.316245 | orchestrator | 2025-09-23 19:15:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:02.350674 | orchestrator | 2025-09-23 19:16:02 | INFO  | Task 87fad0cd-c392-4aeb-9be7-0792a9af9b74 is in state SUCCESS 2025-09-23 19:16:02.351450 | orchestrator | 2025-09-23 19:16:02 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:02.351684 | orchestrator | 2025-09-23 19:16:02 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:02.351866 | orchestrator | 2025-09-23 19:16:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:05.397036 | orchestrator | 2025-09-23 19:16:05 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:05.398716 | orchestrator | 2025-09-23 19:16:05 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:05.398867 | orchestrator | 2025-09-23 19:16:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:08.439501 | orchestrator | 
2025-09-23 19:16:08 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:08.440384 | orchestrator | 2025-09-23 19:16:08 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:08.440420 | orchestrator | 2025-09-23 19:16:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:11.509756 | orchestrator | 2025-09-23 19:16:11 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:11.511255 | orchestrator | 2025-09-23 19:16:11 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:11.511290 | orchestrator | 2025-09-23 19:16:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:14.560047 | orchestrator | 2025-09-23 19:16:14 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:14.561250 | orchestrator | 2025-09-23 19:16:14 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:14.561418 | orchestrator | 2025-09-23 19:16:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:17.599000 | orchestrator | 2025-09-23 19:16:17 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:17.601264 | orchestrator | 2025-09-23 19:16:17 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:17.601298 | orchestrator | 2025-09-23 19:16:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:20.639209 | orchestrator | 2025-09-23 19:16:20 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:20.640754 | orchestrator | 2025-09-23 19:16:20 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:20.641054 | orchestrator | 2025-09-23 19:16:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:23.689478 | orchestrator | 2025-09-23 19:16:23 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in 
state STARTED 2025-09-23 19:16:23.691042 | orchestrator | 2025-09-23 19:16:23 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:23.691479 | orchestrator | 2025-09-23 19:16:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:26.752123 | orchestrator | 2025-09-23 19:16:26 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:26.754165 | orchestrator | 2025-09-23 19:16:26 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:26.754222 | orchestrator | 2025-09-23 19:16:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:29.797338 | orchestrator | 2025-09-23 19:16:29 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:29.798328 | orchestrator | 2025-09-23 19:16:29 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:29.798364 | orchestrator | 2025-09-23 19:16:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:32.839736 | orchestrator | 2025-09-23 19:16:32 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:32.840321 | orchestrator | 2025-09-23 19:16:32 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:32.840369 | orchestrator | 2025-09-23 19:16:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:35.885077 | orchestrator | 2025-09-23 19:16:35 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:35.886798 | orchestrator | 2025-09-23 19:16:35 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:35.886998 | orchestrator | 2025-09-23 19:16:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:38.932141 | orchestrator | 2025-09-23 19:16:38 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:38.933332 | orchestrator | 2025-09-23 19:16:38 | 
INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:38.933363 | orchestrator | 2025-09-23 19:16:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:41.983513 | orchestrator | 2025-09-23 19:16:41 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:41.984684 | orchestrator | 2025-09-23 19:16:41 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:41.984839 | orchestrator | 2025-09-23 19:16:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:45.028094 | orchestrator | 2025-09-23 19:16:45 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:45.028188 | orchestrator | 2025-09-23 19:16:45 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:45.028203 | orchestrator | 2025-09-23 19:16:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:48.063666 | orchestrator | 2025-09-23 19:16:48 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:48.064747 | orchestrator | 2025-09-23 19:16:48 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:48.064952 | orchestrator | 2025-09-23 19:16:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:51.109936 | orchestrator | 2025-09-23 19:16:51 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:51.110087 | orchestrator | 2025-09-23 19:16:51 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:51.110107 | orchestrator | 2025-09-23 19:16:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:54.155305 | orchestrator | 2025-09-23 19:16:54 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:54.157376 | orchestrator | 2025-09-23 19:16:54 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 
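The repeated state checks logged above implement a simple poll-until-terminal-state loop: query each task, report its state, and sleep before the next round until every task has finished. A minimal sketch of that pattern, assuming a hypothetical `get_task_state` callable (not part of the actual tooling; illustration only):

```python
import time

TERMINAL_STATES = {"SUCCESS", "FAILURE"}

def wait_for_tasks(task_ids, get_task_state, interval=1.0):
    """Poll each task until every one reaches a terminal state.

    `get_task_state` maps a task id to a state string such as
    "STARTED" or "SUCCESS" (mirroring the states in the log above).
    Returns the final state observed for each task id.
    """
    pending = set(task_ids)
    final = {}
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state in TERMINAL_STATES:
                final[task_id] = state
        # Drop finished tasks and wait before re-checking the rest.
        pending -= final.keys()
        if pending:
            print(f"Wait {interval:g} second(s) until the next check")
            time.sleep(interval)
    return final
```

Tasks that finish early (like 840b5689 and 87fad0cd above) drop out of the polling set while the remaining ones keep being checked.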
2025-09-23 19:16:54.157427 | orchestrator | 2025-09-23 19:16:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:16:57.201363 | orchestrator | 2025-09-23 19:16:57 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:16:57.202394 | orchestrator | 2025-09-23 19:16:57 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:16:57.202438 | orchestrator | 2025-09-23 19:16:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:00.248144 | orchestrator | 2025-09-23 19:17:00 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:00.248274 | orchestrator | 2025-09-23 19:17:00 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:00.248301 | orchestrator | 2025-09-23 19:17:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:03.281856 | orchestrator | 2025-09-23 19:17:03 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:03.283561 | orchestrator | 2025-09-23 19:17:03 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:03.283599 | orchestrator | 2025-09-23 19:17:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:06.330417 | orchestrator | 2025-09-23 19:17:06 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:06.330590 | orchestrator | 2025-09-23 19:17:06 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:06.331726 | orchestrator | 2025-09-23 19:17:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:09.358280 | orchestrator | 2025-09-23 19:17:09 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:09.358399 | orchestrator | 2025-09-23 19:17:09 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:09.358691 | orchestrator | 2025-09-23 19:17:09 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:17:12.397044 | orchestrator | 2025-09-23 19:17:12 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:12.398829 | orchestrator | 2025-09-23 19:17:12 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:12.399340 | orchestrator | 2025-09-23 19:17:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:15.441243 | orchestrator | 2025-09-23 19:17:15 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:15.442715 | orchestrator | 2025-09-23 19:17:15 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:15.442753 | orchestrator | 2025-09-23 19:17:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:18.483947 | orchestrator | 2025-09-23 19:17:18 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:18.484035 | orchestrator | 2025-09-23 19:17:18 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:18.484046 | orchestrator | 2025-09-23 19:17:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:21.527773 | orchestrator | 2025-09-23 19:17:21 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:21.529718 | orchestrator | 2025-09-23 19:17:21 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:21.529754 | orchestrator | 2025-09-23 19:17:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:24.574980 | orchestrator | 2025-09-23 19:17:24 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:24.577937 | orchestrator | 2025-09-23 19:17:24 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:24.578003 | orchestrator | 2025-09-23 19:17:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:27.624126 | orchestrator | 
2025-09-23 19:17:27 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:27.624742 | orchestrator | 2025-09-23 19:17:27 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:27.624774 | orchestrator | 2025-09-23 19:17:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:30.665059 | orchestrator | 2025-09-23 19:17:30 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:30.666618 | orchestrator | 2025-09-23 19:17:30 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:30.666662 | orchestrator | 2025-09-23 19:17:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:33.699112 | orchestrator | 2025-09-23 19:17:33 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:33.700770 | orchestrator | 2025-09-23 19:17:33 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:33.700816 | orchestrator | 2025-09-23 19:17:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:36.745369 | orchestrator | 2025-09-23 19:17:36 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:36.750434 | orchestrator | 2025-09-23 19:17:36 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:36.750795 | orchestrator | 2025-09-23 19:17:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:39.781960 | orchestrator | 2025-09-23 19:17:39 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:39.782339 | orchestrator | 2025-09-23 19:17:39 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:39.782370 | orchestrator | 2025-09-23 19:17:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:42.819009 | orchestrator | 2025-09-23 19:17:42 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in 
state STARTED 2025-09-23 19:17:42.822203 | orchestrator | 2025-09-23 19:17:42 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:42.822241 | orchestrator | 2025-09-23 19:17:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:45.864414 | orchestrator | 2025-09-23 19:17:45 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:45.866591 | orchestrator | 2025-09-23 19:17:45 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:45.866642 | orchestrator | 2025-09-23 19:17:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:48.913903 | orchestrator | 2025-09-23 19:17:48 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:48.917785 | orchestrator | 2025-09-23 19:17:48 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:48.917873 | orchestrator | 2025-09-23 19:17:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:51.976631 | orchestrator | 2025-09-23 19:17:51 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:51.981069 | orchestrator | 2025-09-23 19:17:51 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:51.981118 | orchestrator | 2025-09-23 19:17:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:55.033598 | orchestrator | 2025-09-23 19:17:55 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:55.033671 | orchestrator | 2025-09-23 19:17:55 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:55.033678 | orchestrator | 2025-09-23 19:17:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:17:58.077636 | orchestrator | 2025-09-23 19:17:58 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:17:58.079112 | orchestrator | 2025-09-23 19:17:58 | 
INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:17:58.080274 | orchestrator | 2025-09-23 19:17:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:01.124748 | orchestrator | 2025-09-23 19:18:01 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:01.126790 | orchestrator | 2025-09-23 19:18:01 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:01.126862 | orchestrator | 2025-09-23 19:18:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:04.159997 | orchestrator | 2025-09-23 19:18:04 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:04.160755 | orchestrator | 2025-09-23 19:18:04 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:04.160920 | orchestrator | 2025-09-23 19:18:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:07.209191 | orchestrator | 2025-09-23 19:18:07 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:07.210683 | orchestrator | 2025-09-23 19:18:07 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:07.210927 | orchestrator | 2025-09-23 19:18:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:10.252947 | orchestrator | 2025-09-23 19:18:10 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:10.254345 | orchestrator | 2025-09-23 19:18:10 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:10.254379 | orchestrator | 2025-09-23 19:18:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:13.291042 | orchestrator | 2025-09-23 19:18:13 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:13.295294 | orchestrator | 2025-09-23 19:18:13 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 
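For reference, the node labels applied by the earlier "Manage labels" task correspond to plain `kubectl label` invocations against each node. A minimal sketch that only builds the command lines (node name and label values taken from the log above; nothing is executed, and the `--overwrite` flag is an assumption about the desired idempotent behavior):

```python
def kubectl_label_commands(node, labels, overwrite=True):
    """Return one `kubectl label` command line per key=value pair."""
    suffix = " --overwrite" if overwrite else ""
    return [f"kubectl label node {node} {label}{suffix}" for label in labels]

# A subset of the labels applied to a control-plane node in the run above.
control_plane_labels = [
    "node-role.osism.tech/control-plane=true",
    "openstack-control-plane=enabled",
    "node-role.osism.tech/network-plane=true",
]
commands = kubectl_label_commands("testbed-node-0", control_plane_labels)
```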
2025-09-23 19:18:13.295573 | orchestrator | 2025-09-23 19:18:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:16.345973 | orchestrator | 2025-09-23 19:18:16 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:16.347988 | orchestrator | 2025-09-23 19:18:16 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:16.348084 | orchestrator | 2025-09-23 19:18:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:19.379791 | orchestrator | 2025-09-23 19:18:19 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:19.382275 | orchestrator | 2025-09-23 19:18:19 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:19.382314 | orchestrator | 2025-09-23 19:18:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:22.418760 | orchestrator | 2025-09-23 19:18:22 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:22.419260 | orchestrator | 2025-09-23 19:18:22 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:22.419295 | orchestrator | 2025-09-23 19:18:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:25.468001 | orchestrator | 2025-09-23 19:18:25 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state STARTED 2025-09-23 19:18:25.469074 | orchestrator | 2025-09-23 19:18:25 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:25.469268 | orchestrator | 2025-09-23 19:18:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:28.517673 | orchestrator | 2025-09-23 19:18:28.517983 | orchestrator | 2025-09-23 19:18:28.518120 | orchestrator | PLAY [Copy kubeconfig to the configuration repository] ************************* 2025-09-23 19:18:28.518136 | orchestrator | 2025-09-23 19:18:28.518147 | orchestrator | TASK [Get kubeconfig file] 
***************************************************** 2025-09-23 19:18:28.518158 | orchestrator | Tuesday 23 September 2025 19:15:54 +0000 (0:00:00.123) 0:00:00.123 ***** 2025-09-23 19:18:28.518170 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] 2025-09-23 19:18:28.518207 | orchestrator | 2025-09-23 19:18:28.518218 | orchestrator | TASK [Write kubeconfig file] *************************************************** 2025-09-23 19:18:28.518228 | orchestrator | Tuesday 23 September 2025 19:15:55 +0000 (0:00:00.662) 0:00:00.786 ***** 2025-09-23 19:18:28.518240 | orchestrator | changed: [testbed-manager] 2025-09-23 19:18:28.518255 | orchestrator | 2025-09-23 19:18:28.518265 | orchestrator | TASK [Change server address in the kubeconfig file] **************************** 2025-09-23 19:18:28.518275 | orchestrator | Tuesday 23 September 2025 19:15:56 +0000 (0:00:00.983) 0:00:01.769 ***** 2025-09-23 19:18:28.518285 | orchestrator | changed: [testbed-manager] 2025-09-23 19:18:28.518295 | orchestrator | 2025-09-23 19:18:28.518306 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:18:28.518317 | orchestrator | testbed-manager : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:18:28.518329 | orchestrator | 2025-09-23 19:18:28.518339 | orchestrator | 2025-09-23 19:18:28.518349 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:18:28.518358 | orchestrator | Tuesday 23 September 2025 19:15:56 +0000 (0:00:00.615) 0:00:02.384 ***** 2025-09-23 19:18:28.518392 | orchestrator | =============================================================================== 2025-09-23 19:18:28.518403 | orchestrator | Write kubeconfig file --------------------------------------------------- 0.98s 2025-09-23 19:18:28.518414 | orchestrator | Get kubeconfig file ----------------------------------------------------- 0.66s 2025-09-23 
19:18:28.518424 | orchestrator | Change server address in the kubeconfig file ---------------------------- 0.62s 2025-09-23 19:18:28.518435 | orchestrator | 2025-09-23 19:18:28.518446 | orchestrator | 2025-09-23 19:18:28.518456 | orchestrator | PLAY [Prepare kubeconfig file] ************************************************* 2025-09-23 19:18:28.518465 | orchestrator | 2025-09-23 19:18:28.518475 | orchestrator | TASK [Get home directory of operator user] ************************************* 2025-09-23 19:18:28.518484 | orchestrator | Tuesday 23 September 2025 19:15:54 +0000 (0:00:00.151) 0:00:00.151 ***** 2025-09-23 19:18:28.518494 | orchestrator | ok: [testbed-manager] 2025-09-23 19:18:28.518505 | orchestrator | 2025-09-23 19:18:28.518515 | orchestrator | TASK [Create .kube directory] ************************************************** 2025-09-23 19:18:28.518525 | orchestrator | Tuesday 23 September 2025 19:15:55 +0000 (0:00:00.489) 0:00:00.641 ***** 2025-09-23 19:18:28.518535 | orchestrator | ok: [testbed-manager] 2025-09-23 19:18:28.518544 | orchestrator | 2025-09-23 19:18:28.518553 | orchestrator | TASK [Get kubeconfig file] ***************************************************** 2025-09-23 19:18:28.518561 | orchestrator | Tuesday 23 September 2025 19:15:55 +0000 (0:00:00.494) 0:00:01.136 ***** 2025-09-23 19:18:28.518570 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] 2025-09-23 19:18:28.518579 | orchestrator | 2025-09-23 19:18:28.518589 | orchestrator | TASK [Write kubeconfig file] *************************************************** 2025-09-23 19:18:28.518599 | orchestrator | Tuesday 23 September 2025 19:15:56 +0000 (0:00:00.802) 0:00:01.938 ***** 2025-09-23 19:18:28.518608 | orchestrator | changed: [testbed-manager] 2025-09-23 19:18:28.518617 | orchestrator | 2025-09-23 19:18:28.518627 | orchestrator | TASK [Change server address in the kubeconfig] ********************************* 2025-09-23 19:18:28.518636 | orchestrator | Tuesday 23 
September 2025 19:15:57 +0000 (0:00:00.974) 0:00:02.913 ***** 2025-09-23 19:18:28.518645 | orchestrator | changed: [testbed-manager] 2025-09-23 19:18:28.518654 | orchestrator | 2025-09-23 19:18:28.518664 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************ 2025-09-23 19:18:28.518674 | orchestrator | Tuesday 23 September 2025 19:15:57 +0000 (0:00:00.646) 0:00:03.560 ***** 2025-09-23 19:18:28.518685 | orchestrator | changed: [testbed-manager -> localhost] 2025-09-23 19:18:28.518694 | orchestrator | 2025-09-23 19:18:28.518704 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ****** 2025-09-23 19:18:28.518715 | orchestrator | Tuesday 23 September 2025 19:15:59 +0000 (0:00:01.144) 0:00:04.704 ***** 2025-09-23 19:18:28.518723 | orchestrator | changed: [testbed-manager -> localhost] 2025-09-23 19:18:28.518747 | orchestrator | 2025-09-23 19:18:28.518757 | orchestrator | TASK [Set KUBECONFIG environment variable] ************************************* 2025-09-23 19:18:28.518766 | orchestrator | Tuesday 23 September 2025 19:15:59 +0000 (0:00:00.809) 0:00:05.514 ***** 2025-09-23 19:18:28.518776 | orchestrator | ok: [testbed-manager] 2025-09-23 19:18:28.518786 | orchestrator | 2025-09-23 19:18:28.518795 | orchestrator | TASK [Enable kubectl command line completion] ********************************** 2025-09-23 19:18:28.518806 | orchestrator | Tuesday 23 September 2025 19:16:00 +0000 (0:00:00.503) 0:00:06.018 ***** 2025-09-23 19:18:28.518817 | orchestrator | ok: [testbed-manager] 2025-09-23 19:18:28.518827 | orchestrator | 2025-09-23 19:18:28.518837 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:18:28.518848 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:18:28.518860 | orchestrator | 2025-09-23 19:18:28.518870 | orchestrator | 2025-09-23 19:18:28.518881 | 
orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:18:28.518892 | orchestrator | Tuesday 23 September 2025 19:16:00 +0000 (0:00:00.309) 0:00:06.328 ***** 2025-09-23 19:18:28.518902 | orchestrator | =============================================================================== 2025-09-23 19:18:28.518913 | orchestrator | Make kubeconfig available for use inside the manager service ------------ 1.14s 2025-09-23 19:18:28.518924 | orchestrator | Write kubeconfig file --------------------------------------------------- 0.97s 2025-09-23 19:18:28.518935 | orchestrator | Change server address in the kubeconfig inside the manager service ------ 0.81s 2025-09-23 19:18:28.518968 | orchestrator | Get kubeconfig file ----------------------------------------------------- 0.80s 2025-09-23 19:18:28.518979 | orchestrator | Change server address in the kubeconfig --------------------------------- 0.65s 2025-09-23 19:18:28.518991 | orchestrator | Set KUBECONFIG environment variable ------------------------------------- 0.50s 2025-09-23 19:18:28.519002 | orchestrator | Create .kube directory -------------------------------------------------- 0.49s 2025-09-23 19:18:28.519013 | orchestrator | Get home directory of operator user ------------------------------------- 0.49s 2025-09-23 19:18:28.519025 | orchestrator | Enable kubectl command line completion ---------------------------------- 0.31s 2025-09-23 19:18:28.519035 | orchestrator | 2025-09-23 19:18:28.519047 | orchestrator | 2025-09-23 19:18:28.519057 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:18:28.519068 | orchestrator | 2025-09-23 19:18:28.519078 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:18:28.519089 | orchestrator | Tuesday 23 September 2025 19:12:54 +0000 (0:00:01.015) 0:00:01.015 ***** 2025-09-23 19:18:28.519100 | orchestrator | ok: 
[testbed-node-0] 2025-09-23 19:18:28.519111 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:18:28.519121 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:18:28.519130 | orchestrator | 2025-09-23 19:18:28.519140 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:18:28.519149 | orchestrator | Tuesday 23 September 2025 19:12:55 +0000 (0:00:01.179) 0:00:02.195 ***** 2025-09-23 19:18:28.519160 | orchestrator | ok: [testbed-node-0] => (item=enable_loadbalancer_True) 2025-09-23 19:18:28.519170 | orchestrator | ok: [testbed-node-1] => (item=enable_loadbalancer_True) 2025-09-23 19:18:28.519181 | orchestrator | ok: [testbed-node-2] => (item=enable_loadbalancer_True) 2025-09-23 19:18:28.519191 | orchestrator | 2025-09-23 19:18:28.519201 | orchestrator | PLAY [Apply role loadbalancer] ************************************************* 2025-09-23 19:18:28.519212 | orchestrator | 2025-09-23 19:18:28.519223 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2025-09-23 19:18:28.519234 | orchestrator | Tuesday 23 September 2025 19:12:57 +0000 (0:00:02.124) 0:00:04.319 ***** 2025-09-23 19:18:28.519246 | orchestrator | included: /ansible/roles/loadbalancer/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.519257 | orchestrator | 2025-09-23 19:18:28.519267 | orchestrator | TASK [loadbalancer : Check IPv6 support] *************************************** 2025-09-23 19:18:28.519290 | orchestrator | Tuesday 23 September 2025 19:12:59 +0000 (0:00:01.688) 0:00:06.007 ***** 2025-09-23 19:18:28.519302 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:18:28.519314 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:18:28.519325 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:18:28.519336 | orchestrator | 2025-09-23 19:18:28.519347 | orchestrator | TASK [Setting sysctl values] *************************************************** 2025-09-23 
19:18:28.519358 | orchestrator | Tuesday 23 September 2025 19:13:01 +0000 (0:00:02.198) 0:00:08.206 ***** 2025-09-23 19:18:28.519402 | orchestrator | included: sysctl for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.519414 | orchestrator | 2025-09-23 19:18:28.519425 | orchestrator | TASK [sysctl : Check IPv6 support] ********************************************* 2025-09-23 19:18:28.519436 | orchestrator | Tuesday 23 September 2025 19:13:02 +0000 (0:00:01.472) 0:00:09.679 ***** 2025-09-23 19:18:28.519447 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:18:28.519458 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:18:28.519469 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:18:28.519479 | orchestrator | 2025-09-23 19:18:28.519491 | orchestrator | TASK [sysctl : Setting sysctl values] ****************************************** 2025-09-23 19:18:28.519501 | orchestrator | Tuesday 23 September 2025 19:13:04 +0000 (0:00:01.959) 0:00:11.638 ***** 2025-09-23 19:18:28.519597 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-09-23 19:18:28.519622 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-09-23 19:18:28.519633 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-09-23 19:18:28.519644 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-09-23 19:18:28.519655 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-09-23 19:18:28.519666 | orchestrator | ok: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-09-23 19:18:28.519678 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-09-23 19:18:28.519689 | orchestrator | ok: [testbed-node-1] => (item={'name': 
'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-09-23 19:18:28.519700 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-09-23 19:18:28.519711 | orchestrator | ok: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-09-23 19:18:28.519722 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-09-23 19:18:28.519733 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-09-23 19:18:28.519743 | orchestrator | 2025-09-23 19:18:28.519754 | orchestrator | TASK [module-load : Load modules] ********************************************** 2025-09-23 19:18:28.519765 | orchestrator | Tuesday 23 September 2025 19:13:08 +0000 (0:00:03.723) 0:00:15.362 ***** 2025-09-23 19:18:28.519776 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2025-09-23 19:18:28.519787 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2025-09-23 19:18:28.519803 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2025-09-23 19:18:28.519814 | orchestrator | 2025-09-23 19:18:28.519825 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2025-09-23 19:18:28.519848 | orchestrator | Tuesday 23 September 2025 19:13:09 +0000 (0:00:01.240) 0:00:16.603 ***** 2025-09-23 19:18:28.519859 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2025-09-23 19:18:28.519869 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2025-09-23 19:18:28.519879 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2025-09-23 19:18:28.519889 | orchestrator | 2025-09-23 19:18:28.519899 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2025-09-23 19:18:28.519909 | orchestrator | Tuesday 23 September 2025 19:13:11 +0000 (0:00:01.896) 0:00:18.499 ***** 2025-09-23 19:18:28.519931 | orchestrator | 
skipping: [testbed-node-0] => (item=ip_vs)  2025-09-23 19:18:28.519940 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.519951 | orchestrator | skipping: [testbed-node-1] => (item=ip_vs)  2025-09-23 19:18:28.519960 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.519970 | orchestrator | skipping: [testbed-node-2] => (item=ip_vs)  2025-09-23 19:18:28.519980 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.519989 | orchestrator | 2025-09-23 19:18:28.519999 | orchestrator | TASK [loadbalancer : Ensuring config directories exist] ************************ 2025-09-23 19:18:28.520009 | orchestrator | Tuesday 23 September 2025 19:13:12 +0000 (0:00:00.458) 0:00:18.958 ***** 2025-09-23 19:18:28.520022 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520039 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520051 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520062 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520080 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520110 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520123 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-23 19:18:28.520136 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-23 19:18:28.520147 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-23 19:18:28.520159 | orchestrator | 2025-09-23 19:18:28.520171 | orchestrator | TASK [loadbalancer : Ensuring haproxy service config subdir exists] ************ 2025-09-23 19:18:28.520182 | orchestrator | Tuesday 23 September 2025 19:13:14 +0000 (0:00:02.306) 0:00:21.266 ***** 2025-09-23 19:18:28.520192 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.520202 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.520212 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.520222 | orchestrator | 2025-09-23 19:18:28.520233 | orchestrator | TASK [loadbalancer : Ensuring proxysql service config subdirectories exist] **** 2025-09-23 19:18:28.520244 | orchestrator | Tuesday 23 September 2025 19:13:15 +0000 (0:00:01.488) 0:00:22.754 ***** 2025-09-23 19:18:28.520253 | orchestrator | changed: [testbed-node-1] => (item=users) 2025-09-23 19:18:28.520263 | orchestrator | changed: [testbed-node-0] => (item=users) 2025-09-23 19:18:28.520273 | orchestrator | changed: [testbed-node-1] => (item=rules) 2025-09-23 19:18:28.520283 | orchestrator | changed: [testbed-node-2] => (item=users) 2025-09-23 19:18:28.520293 | orchestrator | changed: [testbed-node-0] => (item=rules) 2025-09-23 19:18:28.520303 | orchestrator | changed: [testbed-node-2] => (item=rules) 2025-09-23 19:18:28.520313 | orchestrator | 2025-09-23 19:18:28.520324 | orchestrator | TASK [loadbalancer : Ensuring keepalived checks subdir exists] ***************** 2025-09-23 19:18:28.520335 | orchestrator | Tuesday 23 September 2025 19:13:18 +0000 (0:00:02.502) 0:00:25.257 ***** 2025-09-23 19:18:28.520345 | orchestrator | changed: [testbed-node-0] 
2025-09-23 19:18:28.520355 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.520426 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.520439 | orchestrator | 2025-09-23 19:18:28.520450 | orchestrator | TASK [loadbalancer : Remove mariadb.cfg if proxysql enabled] ******************* 2025-09-23 19:18:28.520460 | orchestrator | Tuesday 23 September 2025 19:13:19 +0000 (0:00:01.251) 0:00:26.508 ***** 2025-09-23 19:18:28.520484 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:18:28.520494 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:18:28.520504 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:18:28.520514 | orchestrator | 2025-09-23 19:18:28.520524 | orchestrator | TASK [loadbalancer : Removing checks for services which are disabled] ********** 2025-09-23 19:18:28.520534 | orchestrator | Tuesday 23 September 2025 19:13:21 +0000 (0:00:01.958) 0:00:28.466 ***** 2025-09-23 19:18:28.520552 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.520575 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.520587 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.520598 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.520609 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9', 
'__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-23 19:18:28.520620 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.520631 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.520650 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.520675 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-23 19:18:28.520686 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.520697 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.520708 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.520718 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': 
['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.520728 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-23 19:18:28.520746 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.520756 | orchestrator | 2025-09-23 19:18:28.520766 | orchestrator | TASK [loadbalancer : Copying checks for services which are enabled] ************ 2025-09-23 19:18:28.520776 | orchestrator | Tuesday 23 September 2025 19:13:22 +0000 (0:00:01.418) 0:00:29.884 ***** 2025-09-23 19:18:28.520782 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 
'timeout': '30'}}}) 2025-09-23 19:18:28.520798 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520805 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520811 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 
'timeout': '30'}}}) 2025-09-23 19:18:28.520818 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.520824 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-23 19:18:28.520835 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520841 | orchestrator | 
skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.520856 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-23 19:18:28.520862 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-23 19:18:28.520868 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 
'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.520873 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9', '__omit_place_holder__65cc5810e3763cdb05c99c1c9ed7a6ac5616bac9'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})
2025-09-23 19:18:28.520883 | orchestrator |
2025-09-23 19:18:28.520889 | orchestrator | TASK [loadbalancer : Copying over config.json files for services] **************
2025-09-23 19:18:28.520894 | orchestrator | Tuesday 23 September 2025 19:13:26 +0000 (0:00:03.181) 0:00:33.066 *****
2025-09-23 19:18:28.520900 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.520906 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.522282 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.522360 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.522403 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.522416 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.522451 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
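The `(item={'key': ..., 'value': ...})` dumps above are kolla-ansible service definitions: each service carries an image, volume list, and an optional per-node healthcheck (e.g. `healthcheck_curl` against the node's API address on port 61313 for haproxy). A minimal Python sketch of that shape follows; the `haproxy_service` helper is hypothetical, for illustration only, and is not part of kolla-ansible.

```python
# Hypothetical helper mirroring the haproxy service dicts seen in the log.
def haproxy_service(api_address: str, tag: str = "2024.2") -> dict:
    """Build a haproxy service definition with a per-node healthcheck."""
    return {
        "container_name": "haproxy",
        "group": "loadbalancer",
        "enabled": True,
        "image": f"registry.osism.tech/kolla/haproxy:{tag}",
        "privileged": True,
        "volumes": [
            "/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro",
            "/etc/localtime:/etc/localtime:ro",
            "haproxy_socket:/var/lib/kolla/haproxy/",
        ],
        "dimensions": {},
        "healthcheck": {
            "interval": "30",
            "retries": "3",
            "start_period": "5",
            # probe the haproxy monitor endpoint on this node's API address
            "test": ["CMD-SHELL", f"healthcheck_curl http://{api_address}:61313"],
            "timeout": "30",
        },
    }

svc = haproxy_service("192.168.16.10")
print(svc["healthcheck"]["test"][1])  # healthcheck_curl http://192.168.16.10:61313
```

The varying IP per testbed node (`.10`, `.11`, `.12`) in the logged items comes from templating the healthcheck with each host's own API interface address.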
2025-09-23 19:18:28.522464 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.522475 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.522487 | orchestrator |
2025-09-23 19:18:28.522500 | orchestrator | TASK [loadbalancer : Copying over haproxy.cfg] *********************************
2025-09-23 19:18:28.522512 | orchestrator | Tuesday 23 September 2025 19:13:29 +0000 (0:00:03.181) 0:00:36.248 *****
2025-09-23 19:18:28.522523 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2)
2025-09-23 19:18:28.522541 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2)
2025-09-23 19:18:28.522552 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2)
2025-09-23 19:18:28.522563 | orchestrator |
2025-09-23 19:18:28.522590 | orchestrator | TASK [loadbalancer : Copying over proxysql config] *****************************
2025-09-23 19:18:28.522602 | orchestrator | Tuesday 23 September 2025 19:13:31 +0000 (0:00:01.975) 0:00:38.223 *****
2025-09-23 19:18:28.522613 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2)
2025-09-23 19:18:28.522624 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2)
2025-09-23 19:18:28.522634 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2)
2025-09-23 19:18:28.522645 | orchestrator |
2025-09-23 19:18:28.522656 | orchestrator | TASK [loadbalancer : Copying over haproxy single external frontend config] *****
2025-09-23 19:18:28.522666 | orchestrator | Tuesday 23 September 2025 19:13:34 +0000 (0:00:03.297) 0:00:41.520 *****
2025-09-23 19:18:28.522677 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.522688 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.522699 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.522709 | orchestrator |
2025-09-23 19:18:28.522720 | orchestrator | TASK [loadbalancer : Copying over custom haproxy services configuration] *******
2025-09-23 19:18:28.522730 | orchestrator | Tuesday 23 September 2025 19:13:35 +0000 (0:00:00.487) 0:00:42.008 *****
2025-09-23 19:18:28.522741 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg)
2025-09-23 19:18:28.522753 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg)
2025-09-23 19:18:28.522771 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg)
2025-09-23 19:18:28.522781 | orchestrator |
2025-09-23 19:18:28.522792 | orchestrator | TASK [loadbalancer : Copying over keepalived.conf] *****************************
2025-09-23 19:18:28.522803 | orchestrator | Tuesday 23 September 2025 19:13:37 +0000 (0:00:02.262) 0:00:44.270 *****
2025-09-23 19:18:28.522813 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2)
2025-09-23 19:18:28.522824 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2)
2025-09-23 19:18:28.522835 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2)
2025-09-23 19:18:28.522845 | orchestrator |
2025-09-23 19:18:28.522856 | orchestrator | TASK [loadbalancer : Copying over haproxy.pem] *********************************
2025-09-23 19:18:28.522867 | orchestrator | Tuesday 23 September 2025 19:13:39 +0000 (0:00:02.506) 0:00:46.777 *****
2025-09-23 19:18:28.522878 | orchestrator | changed: [testbed-node-0] => (item=haproxy.pem)
2025-09-23 19:18:28.522889 | orchestrator | changed: [testbed-node-1] => (item=haproxy.pem)
2025-09-23 19:18:28.522899 | orchestrator | changed: [testbed-node-2] => (item=haproxy.pem)
2025-09-23 19:18:28.522910 | orchestrator |
2025-09-23 19:18:28.522921 | orchestrator | TASK [loadbalancer : Copying over haproxy-internal.pem] ************************
2025-09-23 19:18:28.522931 | orchestrator | Tuesday 23 September 2025 19:13:41 +0000 (0:00:01.323) 0:00:48.100 *****
2025-09-23 19:18:28.522942 | orchestrator | changed: [testbed-node-0] => (item=haproxy-internal.pem)
2025-09-23 19:18:28.522953 | orchestrator | changed: [testbed-node-1] => (item=haproxy-internal.pem)
2025-09-23 19:18:28.522963 | orchestrator | changed: [testbed-node-2] => (item=haproxy-internal.pem)
2025-09-23 19:18:28.522974 | orchestrator |
2025-09-23 19:18:28.522984 | orchestrator | TASK [loadbalancer : include_tasks] ********************************************
2025-09-23 19:18:28.522995 | orchestrator | Tuesday 23 September 2025 19:13:42 +0000 (0:00:01.415) 0:00:49.515 *****
2025-09-23 19:18:28.523006 | orchestrator | included: /ansible/roles/loadbalancer/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.523016 | orchestrator |
2025-09-23 19:18:28.523027 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over extra CA certificates] ***
2025-09-23 19:18:28.523037 | orchestrator | Tuesday 23 September 2025 19:13:43 +0000 (0:00:00.515) 0:00:50.031 *****
2025-09-23 19:18:28.523049 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523073 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523086 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523104 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523116 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523127 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523138 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523150 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523172 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/',
'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523190 | orchestrator |
2025-09-23 19:18:28.523201 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over backend internal TLS certificate] ***
2025-09-23 19:18:28.523212 | orchestrator | Tuesday 23 September 2025 19:13:47 +0000 (0:00:04.015) 0:00:54.046 *****
2025-09-23 19:18:28.523223 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523234 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523245 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523256 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.523268 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523279 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523290 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523313 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.523336 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523348 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523360 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523390 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.523401 | orchestrator |
2025-09-23 19:18:28.523412 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over backend internal TLS key] ***
2025-09-23 19:18:28.523422 | orchestrator | Tuesday 23 September 2025 19:13:48 +0000 (0:00:01.497) 0:00:55.544 *****
2025-09-23 19:18:28.523434 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523445 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523457 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523474 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.523496 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523508 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523520 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value':
{'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523575 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.523586 | orchestrator |
2025-09-23 19:18:28.523597 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ********
2025-09-23 19:18:28.523614 | orchestrator | Tuesday 23 September 2025 19:13:51 +0000 (0:00:02.755) 0:00:58.300 *****
2025-09-23 19:18:28.523630 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523660 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523672 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523683 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.523694 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523705 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523716 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523727 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.523738 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523766 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523778 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523790 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.523801 | orchestrator |
2025-09-23 19:18:28.523811 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS certificate] ***
2025-09-23 19:18:28.523822 | orchestrator | Tuesday 23 September 2025 19:13:52 +0000 (0:00:00.820) 0:00:59.121 *****
2025-09-23 19:18:28.523833 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523845 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})
2025-09-23 19:18:28.523856 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})
2025-09-23 19:18:28.523867 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.523878 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})
2025-09-23 19:18:28.523896 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro',
'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.523915 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.523926 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.523937 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.523948 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.523960 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.523971 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.523982 | orchestrator | 2025-09-23 19:18:28.523993 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2025-09-23 19:18:28.524004 | orchestrator | Tuesday 23 September 2025 19:13:53 +0000 (0:00:01.036) 0:01:00.158 ***** 2025-09-23 19:18:28.524015 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524060 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 
'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524077 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524089 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.524107 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524119 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524130 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524141 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.524152 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524170 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 
'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524181 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524192 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.524203 | orchestrator | 2025-09-23 19:18:28.524214 | orchestrator | TASK [service-cert-copy : proxysql | Copying over extra CA certificates] ******* 2025-09-23 19:18:28.524225 | orchestrator | Tuesday 23 September 2025 19:13:54 +0000 (0:00:00.900) 0:01:01.058 ***** 2025-09-23 19:18:28.524248 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  
2025-09-23 19:18:28.524261 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524272 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524283 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524300 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': 
{'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524312 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524323 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.524334 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.524353 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524387 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524400 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524411 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.524422 | orchestrator | 2025-09-23 19:18:28.524433 | orchestrator | TASK [service-cert-copy : proxysql | Copying over backend internal TLS certificate] *** 2025-09-23 19:18:28.524443 | orchestrator | Tuesday 23 September 2025 19:13:55 +0000 (0:00:01.221) 0:01:02.280 ***** 2025-09-23 19:18:28.524454 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524472 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524483 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524494 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.524505 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524527 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524539 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524550 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.524562 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524579 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524590 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524601 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.524612 | orchestrator | 2025-09-23 19:18:28.524623 | orchestrator | TASK [service-cert-copy : proxysql | Copying over backend internal TLS key] **** 2025-09-23 19:18:28.524633 | orchestrator | Tuesday 23 September 2025 19:13:56 +0000 (0:00:00.780) 0:01:03.060 ***** 2025-09-23 19:18:28.524644 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524661 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524680 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524691 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.524703 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524720 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524731 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524742 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.524753 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-23 19:18:28.524764 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-23 19:18:28.524780 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-23 19:18:28.524792 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.524803 | orchestrator | 2025-09-23 19:18:28.524819 | orchestrator | TASK [loadbalancer : Copying over haproxy start script] ************************ 2025-09-23 19:18:28.524830 | orchestrator | Tuesday 23 September 2025 19:13:56 +0000 (0:00:00.731) 0:01:03.792 ***** 2025-09-23 19:18:28.524841 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-09-23 19:18:28.524852 | orchestrator | 
changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-09-23 19:18:28.524863 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-09-23 19:18:28.524879 | orchestrator | 2025-09-23 19:18:28.524890 | orchestrator | TASK [loadbalancer : Copying over proxysql start script] *********************** 2025-09-23 19:18:28.524901 | orchestrator | Tuesday 23 September 2025 19:13:59 +0000 (0:00:02.174) 0:01:05.966 ***** 2025-09-23 19:18:28.524911 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-09-23 19:18:28.524922 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-09-23 19:18:28.524932 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-09-23 19:18:28.524943 | orchestrator | 2025-09-23 19:18:28.524954 | orchestrator | TASK [loadbalancer : Copying files for haproxy-ssh] **************************** 2025-09-23 19:18:28.524964 | orchestrator | Tuesday 23 September 2025 19:14:00 +0000 (0:00:01.950) 0:01:07.917 ***** 2025-09-23 19:18:28.524975 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-09-23 19:18:28.524986 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-09-23 19:18:28.524996 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.525007 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-09-23 19:18:28.525017 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-09-23 19:18:28.525028 | orchestrator | skipping: [testbed-node-1] => (item={'src': 
'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-09-23 19:18:28.525038 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.525049 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-09-23 19:18:28.525060 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.525070 | orchestrator | 2025-09-23 19:18:28.525081 | orchestrator | TASK [loadbalancer : Check loadbalancer containers] **************************** 2025-09-23 19:18:28.525092 | orchestrator | Tuesday 23 September 2025 19:14:02 +0000 (0:00:01.378) 0:01:09.295 ***** 2025-09-23 19:18:28.525103 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-09-23 19:18:28.525115 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-09-23 19:18:28.525131 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-09-23 19:18:28.525157 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-23 19:18:28.525169 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-23 19:18:28.525180 | orchestrator | 
changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-23 19:18:28.525192 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-23 19:18:28.525203 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-23 19:18:28.525214 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': 
['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-23 19:18:28.525225 | orchestrator | 2025-09-23 19:18:28.525236 | orchestrator | TASK [include_role : aodh] ***************************************************** 2025-09-23 19:18:28.525247 | orchestrator | Tuesday 23 September 2025 19:14:04 +0000 (0:00:02.477) 0:01:11.772 ***** 2025-09-23 19:18:28.525258 | orchestrator | included: aodh for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.525268 | orchestrator | 2025-09-23 19:18:28.525279 | orchestrator | TASK [haproxy-config : Copying over aodh haproxy config] *********************** 2025-09-23 19:18:28.525296 | orchestrator | Tuesday 23 September 2025 19:14:05 +0000 (0:00:00.786) 0:01:12.558 ***** 2025-09-23 19:18:28.526101 | orchestrator | changed: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-09-23 19:18:28.526140 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 
'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.526148 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526154 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526159 | orchestrator | changed: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-09-23 19:18:28.526165 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.526190 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526196 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 
'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526202 | orchestrator | changed: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-09-23 19:18:28.526208 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.526213 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526219 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526233 | orchestrator | 2025-09-23 19:18:28.526239 | orchestrator | TASK [haproxy-config : Add configuration for aodh when using single external frontend] *** 2025-09-23 19:18:28.526245 | orchestrator | Tuesday 23 September 2025 19:14:09 +0000 (0:00:04.313) 0:01:16.872 ***** 2025-09-23 19:18:28.526259 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-09-23 19:18:28.526265 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.526270 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526276 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526282 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.526288 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-09-23 19:18:28.526294 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.526307 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': 
['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526317 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526323 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.526329 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-09-23 19:18:28.526334 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.526340 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526346 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526356 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.526361 | orchestrator | 2025-09-23 19:18:28.526386 | orchestrator | TASK [haproxy-config : Configuring firewall for aodh] ************************** 2025-09-23 
19:18:28.526395 | orchestrator | Tuesday 23 September 2025 19:14:10 +0000 (0:00:00.990) 0:01:17.862 ***** 2025-09-23 19:18:28.526404 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-09-23 19:18:28.526415 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-09-23 19:18:28.526426 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.526431 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-09-23 19:18:28.526440 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-09-23 19:18:28.526446 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.526455 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-09-23 19:18:28.526461 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-09-23 19:18:28.526467 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.526472 | orchestrator | 2025-09-23 19:18:28.526478 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL users config] *************** 2025-09-23 19:18:28.526483 | orchestrator | Tuesday 23 September 2025 19:14:12 +0000 (0:00:01.225) 0:01:19.087 ***** 2025-09-23 
19:18:28.526489 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.526494 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.526499 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.526504 | orchestrator | 2025-09-23 19:18:28.526510 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL rules config] *************** 2025-09-23 19:18:28.526515 | orchestrator | Tuesday 23 September 2025 19:14:13 +0000 (0:00:01.202) 0:01:20.290 ***** 2025-09-23 19:18:28.526520 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.526526 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.526531 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.526536 | orchestrator | 2025-09-23 19:18:28.526542 | orchestrator | TASK [include_role : barbican] ************************************************* 2025-09-23 19:18:28.526547 | orchestrator | Tuesday 23 September 2025 19:14:15 +0000 (0:00:02.016) 0:01:22.306 ***** 2025-09-23 19:18:28.526553 | orchestrator | included: barbican for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.526558 | orchestrator | 2025-09-23 19:18:28.526563 | orchestrator | TASK [haproxy-config : Copying over barbican haproxy config] ******************* 2025-09-23 19:18:28.526568 | orchestrator | Tuesday 23 September 2025 19:14:16 +0000 (0:00:00.860) 0:01:23.167 ***** 2025-09-23 19:18:28.526575 | orchestrator | changed: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 
'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.526592 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526602 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526620 | orchestrator | changed: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.526630 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526640 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526659 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.526669 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526678 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526687 | orchestrator | 2025-09-23 19:18:28.526697 | orchestrator | TASK [haproxy-config : Add configuration for barbican when using single external frontend] *** 2025-09-23 19:18:28.526706 | orchestrator | Tuesday 23 September 2025 19:14:19 +0000 (0:00:03.435) 0:01:26.602 ***** 2025-09-23 19:18:28.526716 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.526722 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526732 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526738 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.526744 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.526749 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 
'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526762 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526767 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.526773 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 
'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.526779 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526788 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.526794 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.526799 | orchestrator | 2025-09-23 19:18:28.526805 | orchestrator | TASK [haproxy-config : Configuring firewall for barbican] ********************** 2025-09-23 19:18:28.526810 | orchestrator | Tuesday 23 September 2025 19:14:20 +0000 (0:00:00.649) 0:01:27.252 ***** 2025-09-23 19:18:28.526816 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 
'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-23 19:18:28.526822 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-23 19:18:28.526828 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.526834 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-23 19:18:28.526839 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-23 19:18:28.526845 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.526850 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-23 19:18:28.526858 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-23 19:18:28.526864 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.526869 | orchestrator | 2025-09-23 19:18:28.526875 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL users config] *********** 2025-09-23 19:18:28.526883 | orchestrator | Tuesday 23 September 2025 19:14:21 +0000 (0:00:01.138) 0:01:28.390 ***** 2025-09-23 19:18:28.526889 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.526894 | orchestrator | 
changed: [testbed-node-1] 2025-09-23 19:18:28.526899 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.526905 | orchestrator | 2025-09-23 19:18:28.526910 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL rules config] *********** 2025-09-23 19:18:28.526916 | orchestrator | Tuesday 23 September 2025 19:14:22 +0000 (0:00:01.214) 0:01:29.604 ***** 2025-09-23 19:18:28.526921 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.526926 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.526936 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.526941 | orchestrator | 2025-09-23 19:18:28.526947 | orchestrator | TASK [include_role : blazar] *************************************************** 2025-09-23 19:18:28.526953 | orchestrator | Tuesday 23 September 2025 19:14:24 +0000 (0:00:02.040) 0:01:31.645 ***** 2025-09-23 19:18:28.526959 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.526964 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.526970 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.526976 | orchestrator | 2025-09-23 19:18:28.526981 | orchestrator | TASK [include_role : ceph-rgw] ************************************************* 2025-09-23 19:18:28.526987 | orchestrator | Tuesday 23 September 2025 19:14:25 +0000 (0:00:00.326) 0:01:31.972 ***** 2025-09-23 19:18:28.526993 | orchestrator | included: ceph-rgw for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.526998 | orchestrator | 2025-09-23 19:18:28.527004 | orchestrator | TASK [haproxy-config : Copying over ceph-rgw haproxy config] ******************* 2025-09-23 19:18:28.527010 | orchestrator | Tuesday 23 September 2025 19:14:25 +0000 (0:00:00.697) 0:01:32.669 ***** 2025-09-23 19:18:28.527016 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 
'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-09-23 19:18:28.527023 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-09-23 19:18:28.527030 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 
'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-09-23 19:18:28.527036 | orchestrator | 2025-09-23 19:18:28.527041 | orchestrator | TASK [haproxy-config : Add configuration for ceph-rgw when using single external frontend] *** 2025-09-23 19:18:28.527047 | orchestrator | Tuesday 23 September 2025 19:14:29 +0000 (0:00:03.304) 0:01:35.974 ***** 2025-09-23 19:18:28.527063 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-09-23 19:18:28.527073 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.527079 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 
check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-09-23 19:18:28.527085 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.527091 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-09-23 19:18:28.527097 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.527102 | orchestrator | 2025-09-23 19:18:28.527108 | orchestrator | TASK [haproxy-config : Configuring firewall for ceph-rgw] ********************** 2025-09-23 19:18:28.527114 | orchestrator | Tuesday 23 September 2025 19:14:30 +0000 (0:00:01.365) 0:01:37.339 ***** 2025-09-23 19:18:28.527120 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server 
testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-23 19:18:28.527127 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-23 19:18:28.527133 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.527139 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-23 19:18:28.527170 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-23 19:18:28.527177 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.527182 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check 
inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-23 19:18:28.527189 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-23 19:18:28.527195 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.527200 | orchestrator | 2025-09-23 19:18:28.527206 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL users config] *********** 2025-09-23 19:18:28.527212 | orchestrator | Tuesday 23 September 2025 19:14:32 +0000 (0:00:01.644) 0:01:38.984 ***** 2025-09-23 19:18:28.527217 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.527223 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.527229 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.527234 | orchestrator | 2025-09-23 19:18:28.527240 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL rules config] *********** 2025-09-23 19:18:28.527246 | orchestrator | Tuesday 23 September 2025 19:14:32 +0000 (0:00:00.608) 0:01:39.592 ***** 2025-09-23 19:18:28.527252 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.527257 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.527263 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.527268 | orchestrator | 2025-09-23 19:18:28.527274 | orchestrator | TASK [include_role : cinder] *************************************************** 2025-09-23 19:18:28.527280 | orchestrator | Tuesday 23 September 2025 19:14:33 +0000 (0:00:01.019) 
0:01:40.611 ***** 2025-09-23 19:18:28.527286 | orchestrator | included: cinder for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.527291 | orchestrator | 2025-09-23 19:18:28.527297 | orchestrator | TASK [haproxy-config : Copying over cinder haproxy config] ********************* 2025-09-23 19:18:28.527303 | orchestrator | Tuesday 23 September 2025 19:14:34 +0000 (0:00:00.654) 0:01:41.266 ***** 2025-09-23 19:18:28.527309 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.527321 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527334 | orchestrator 
| skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527341 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527348 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.527354 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527360 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': 
'30'}}})  2025-09-23 19:18:28.527393 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527403 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.527410 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': 
['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527416 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527422 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527434 | 
orchestrator | 2025-09-23 19:18:28.527440 | orchestrator | TASK [haproxy-config : Add configuration for cinder when using single external frontend] *** 2025-09-23 19:18:28.527446 | orchestrator | Tuesday 23 September 2025 19:14:38 +0000 (0:00:04.095) 0:01:45.361 ***** 2025-09-23 19:18:28.527455 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.527465 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527471 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 
'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.527477 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527483 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527493 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527499 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.527512 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527519 | orchestrator | skipping: 
[testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527525 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.527531 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.527537 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': 
['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527547 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527556 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527562 | 
orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.527568 | orchestrator |
2025-09-23 19:18:28.527577 | orchestrator | TASK [haproxy-config : Configuring firewall for cinder] ************************
2025-09-23 19:18:28.527583 | orchestrator | Tuesday 23 September 2025 19:14:39 +0000 (0:00:00.918) 0:01:46.280 *****
2025-09-23 19:18:28.527590 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})
2025-09-23 19:18:28.527596 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})
2025-09-23 19:18:28.527602 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.527608 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})
2025-09-23 19:18:28.527614 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})
2025-09-23 19:18:28.527620 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})
2025-09-23 19:18:28.527626 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.527632 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})
2025-09-23 19:18:28.527638 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.527644 | orchestrator |
2025-09-23 19:18:28.527653 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL users config] *************
2025-09-23 19:18:28.527659 | orchestrator | Tuesday 23 September 2025 19:14:40 +0000 (0:00:00.812) 0:01:47.093 *****
2025-09-23 19:18:28.527665 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.527670 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.527676 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.527682 | orchestrator |
2025-09-23 19:18:28.527688 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL rules config] *************
2025-09-23 19:18:28.527695 | orchestrator | Tuesday 23 September 2025 19:14:41 +0000 (0:00:01.300) 0:01:48.393 *****
2025-09-23 19:18:28.527705 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.527714 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.527724 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.527733 | orchestrator |
2025-09-23 19:18:28.527742 | orchestrator | TASK [include_role : cloudkitty] ***********************************************
2025-09-23 19:18:28.527752 | orchestrator | Tuesday 23 September 2025 19:14:43 +0000 (0:00:01.803) 0:01:50.196 *****
2025-09-23 19:18:28.527762 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.527772 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.527781 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.527791 | orchestrator |
2025-09-23 19:18:28.527801 | orchestrator | TASK [include_role : cyborg] ***************************************************
2025-09-23 19:18:28.527811 | orchestrator | Tuesday 23 September 2025 19:14:43 +0000 (0:00:00.405) 0:01:50.602 *****
2025-09-23 19:18:28.527818 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.527824 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.527830 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.527835 | orchestrator |
2025-09-23 19:18:28.527841 | orchestrator | TASK [include_role : designate] ************************************************
2025-09-23 19:18:28.527847 | orchestrator | Tuesday 23 September 2025 19:14:43 +0000 (0:00:00.280) 0:01:50.883 *****
2025-09-23 19:18:28.527852 | orchestrator | included: designate for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.527858 | orchestrator |
2025-09-23 19:18:28.527864 | orchestrator | TASK [haproxy-config : Copying over designate haproxy config] ******************
2025-09-23 19:18:28.527869 | orchestrator | Tuesday 23 September 2025 19:14:44 +0000 (0:00:00.729) 0:01:51.613 *****
2025-09-23 19:18:28.527881 | orchestrator | changed: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})
2025-09-23 19:18:28.527893 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes':
['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-23 19:18:28.527900 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527911 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527917 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527923 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527929 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527941 | orchestrator | changed: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-09-23 19:18:28.527947 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-23 19:18:28.527957 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527963 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 
'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527969 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527975 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527981 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.527993 | orchestrator | changed: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-09-23 19:18:28.528003 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-23 19:18:28.528010 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': 
{'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528016 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528022 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528028 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 
'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528040 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528049 | orchestrator | 2025-09-23 19:18:28.528055 | orchestrator | TASK [haproxy-config : Add configuration for designate when using single external frontend] *** 2025-09-23 19:18:28.528061 | orchestrator | Tuesday 23 September 2025 19:14:48 +0000 (0:00:04.136) 0:01:55.750 ***** 2025-09-23 19:18:28.528067 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 
'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-09-23 19:18:28.528073 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-23 19:18:28.528079 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528085 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528091 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528103 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528114 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528120 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.528126 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-09-23 19:18:28.528132 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-23 19:18:28.528138 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': 
['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528144 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528153 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528167 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528173 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528179 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.528185 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-09-23 19:18:28.528191 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 
'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-23 19:18:28.528197 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528203 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528219 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528225 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528231 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.528237 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.528243 | orchestrator | 2025-09-23 19:18:28.528249 | orchestrator | TASK [haproxy-config : Configuring firewall for designate] ********************* 2025-09-23 19:18:28.528255 | orchestrator | Tuesday 23 September 2025 19:14:49 +0000 (0:00:00.890) 
0:01:56.641 *****
2025-09-23 19:18:28.528261 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})
2025-09-23 19:18:28.528267 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})
2025-09-23 19:18:28.528273 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.528279 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})
2025-09-23 19:18:28.528284 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})
2025-09-23 19:18:28.528290 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.528296 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})
2025-09-23 19:18:28.528302 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})
2025-09-23 19:18:28.528307 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.528313 | orchestrator |
2025-09-23 19:18:28.528319 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL users config] **********
2025-09-23 19:18:28.528329 | orchestrator | Tuesday 23 September 2025 19:14:50 +0000 (0:00:01.090) 0:01:57.731 *****
2025-09-23 19:18:28.528335 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.528341 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.528346 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.528352 | orchestrator |
2025-09-23 19:18:28.528358 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL rules config] **********
2025-09-23 19:18:28.528380 | orchestrator | Tuesday 23 September 2025 19:14:52 +0000 (0:00:01.309) 0:01:59.041 *****
2025-09-23 19:18:28.528386 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.528392 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.528398 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.528403 | orchestrator |
2025-09-23 19:18:28.528409 | orchestrator | TASK [include_role : etcd] *****************************************************
2025-09-23 19:18:28.528415 | orchestrator | Tuesday 23 September 2025 19:14:54 +0000 (0:00:02.125) 0:02:01.167 *****
2025-09-23 19:18:28.528420 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.528426 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.528432 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.528437 | orchestrator |
2025-09-23 19:18:28.528449 | orchestrator | TASK [include_role : glance] ***************************************************
2025-09-23 19:18:28.528455 | orchestrator | Tuesday 23 September 2025 19:14:54 +0000 (0:00:00.531) 0:02:01.698 *****
2025-09-23 19:18:28.528460 | orchestrator | included: glance for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.528466 | orchestrator |
2025-09-23 19:18:28.528476 | orchestrator | TASK [haproxy-config : Copying over glance haproxy config] *********************
2025-09-23 19:18:28.528482 | orchestrator | Tuesday 23 September 2025 19:14:55 +0000 (0:00:00.830) 0:02:02.528 *****
2025-09-23 19:18:28.528489 | orchestrator | changed: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api',
'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-09-23 19:18:28.528497 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-23 19:18:28.528517 | orchestrator | changed: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-09-23 19:18:28.528524 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 
'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-23 19:18:28.528542 | orchestrator | changed: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-09-23 19:18:28.528549 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 
192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})
2025-09-23 19:18:28.528559 | orchestrator |
2025-09-23 19:18:28.528565 | orchestrator | TASK [haproxy-config : Add configuration for glance when using single external frontend] ***
2025-09-23 19:18:28.528571 | orchestrator | Tuesday 23 September 2025 19:15:00 +0000 (0:00:04.559) 0:02:07.088 *****
2025-09-23 19:18:28.528584 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl
http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-09-23 19:18:28.528591 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 
192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-23 19:18:28.528601 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.528613 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 
'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-09-23 19:18:28.528620 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 
'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})
2025-09-23 19:18:28.528630 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.528637 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})
2025-09-23 19:18:28.529605 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})
2025-09-23 19:18:28.529649 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.529660 | orchestrator |
2025-09-23 19:18:28.529670 | orchestrator | TASK [haproxy-config : Configuring firewall for glance] ************************
2025-09-23 19:18:28.529680 | orchestrator | Tuesday 23 September 2025 19:15:03 +0000 (0:00:03.838) 0:02:10.927 *****
2025-09-23 19:18:28.529691 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})
2025-09-23 19:18:28.529714 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})
2025-09-23 19:18:28.529724 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.529734 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})
2025-09-23 19:18:28.529744 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})
2025-09-23 19:18:28.529754 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.529774 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})
2025-09-23 19:18:28.529784 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})
2025-09-23 19:18:28.529794 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.529804 | orchestrator |
2025-09-23 19:18:28.529813 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL users config] *************
2025-09-23 19:18:28.529823 | orchestrator | Tuesday 23 September 2025 19:15:07 +0000 (0:00:03.828) 0:02:14.755 *****
2025-09-23 19:18:28.529832 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.529842 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.529851 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.529861 | orchestrator |
2025-09-23 19:18:28.529870 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL rules config] *************
2025-09-23 19:18:28.529897 | orchestrator | Tuesday 23 September 2025 19:15:09 +0000 (0:00:01.274) 0:02:16.030 *****
2025-09-23 19:18:28.529914 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.529923 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.529933 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.529942 | orchestrator |
2025-09-23 19:18:28.529952 | orchestrator | TASK [include_role : gnocchi] **************************************************
2025-09-23 19:18:28.529961 | orchestrator | Tuesday 23 September 2025 19:15:11 +0000 (0:00:00.392) 0:02:18.008 *****
2025-09-23 19:18:28.529971 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.529980 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.529990 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.529999 | orchestrator |
2025-09-23 19:18:28.530009 | orchestrator | TASK [include_role : grafana] **************************************************
2025-09-23 19:18:28.530060 | orchestrator | Tuesday 23 September 2025 19:15:11 +0000 (0:00:00.392) 0:02:18.401 *****
2025-09-23 19:18:28.530072 | orchestrator | included: grafana for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.530082 | orchestrator |
2025-09-23 19:18:28.530092 | orchestrator | TASK [haproxy-config : Copying over grafana haproxy config] ********************
2025-09-23 19:18:28.530102 | orchestrator | Tuesday 23 September 2025 19:15:12 +0000 (0:00:00.763) 0:02:19.164 *****
2025-09-23 19:18:28.530112 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-23 19:18:28.530123 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-23 19:18:28.530138 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-23 19:18:28.530148 | orchestrator |
2025-09-23 19:18:28.530164 | orchestrator | TASK [haproxy-config : Add configuration for grafana when using single external frontend] ***
2025-09-23 19:18:28.530176 | orchestrator | Tuesday 23 September 2025 19:15:15 +0000 (0:00:03.669) 0:02:22.834 *****
2025-09-23 19:18:28.530187 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-23 19:18:28.530204 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-23 19:18:28.530216 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.530226 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.530237 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-23 19:18:28.530247 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.530258 | orchestrator |
2025-09-23 19:18:28.530269 | orchestrator | TASK [haproxy-config : Configuring firewall for grafana] ***********************
2025-09-23 19:18:28.530280 | orchestrator | Tuesday 23 September 2025 19:15:16 +0000 (0:00:00.743) 0:02:23.578 *****
2025-09-23 19:18:28.530292 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})
2025-09-23 19:18:28.530303 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})
2025-09-23 19:18:28.530315 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.530326 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})
2025-09-23 19:18:28.530337 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})
2025-09-23 19:18:28.530347 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.530358 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})
2025-09-23 19:18:28.530424 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})
2025-09-23 19:18:28.530436 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.530447 | orchestrator |
2025-09-23 19:18:28.530458 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL users config] ************
2025-09-23 19:18:28.530472 | orchestrator | Tuesday 23 September 2025 19:15:17 +0000 (0:00:00.669) 0:02:24.247 *****
2025-09-23 19:18:28.530483 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.530494 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.530512 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.530522 | orchestrator |
2025-09-23 19:18:28.530544 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL rules config] ************
2025-09-23 19:18:28.530554 | orchestrator | Tuesday 23 September 2025 19:15:18 +0000 (0:00:01.358) 0:02:25.606 *****
2025-09-23 19:18:28.530564 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.530573 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.530583 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.530592 | orchestrator |
2025-09-23 19:18:28.530602 | orchestrator | TASK [include_role : heat] *****************************************************
2025-09-23 19:18:28.530611 | orchestrator | Tuesday 23 September 2025 19:15:20 +0000 (0:00:01.797) 0:02:27.404 *****
2025-09-23 19:18:28.530621 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.530630 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.530639 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.530649 | orchestrator |
2025-09-23 19:18:28.530658 | orchestrator | TASK [include_role : horizon] **************************************************
2025-09-23 19:18:28.530667 | orchestrator | Tuesday 23 September 2025 19:15:20 +0000 (0:00:00.388) 0:02:27.792 *****
2025-09-23 19:18:28.530677 | orchestrator | included: horizon for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.530686 | orchestrator |
2025-09-23 19:18:28.530696 | orchestrator | TASK [haproxy-config : Copying over horizon haproxy config] ********************
2025-09-23 19:18:28.530705 | orchestrator | Tuesday 23 September 2025 19:15:21 +0000 (0:00:00.806) 0:02:28.598 *****
2025-09-23 19:18:28.530717 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:18:28.530742 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:18:28.530760 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:18:28.530771 | orchestrator |
2025-09-23 19:18:28.530781 | orchestrator | TASK [haproxy-config : Add configuration for horizon when using single external frontend] ***
2025-09-23 19:18:28.530795 | orchestrator | Tuesday 23 September 2025 19:15:26 +0000 (0:00:04.403) 0:02:33.002 *****
2025-09-23 19:18:28.530818 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:18:28.530830 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.530840 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:18:28.530857 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.530879 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:18:28.530890 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.530900 | orchestrator |
2025-09-23 19:18:28.530909 | orchestrator | TASK [haproxy-config : Configuring firewall for horizon] ***********************
2025-09-23 19:18:28.530919 | orchestrator | Tuesday 23 September 2025 19:15:26 +0000 (0:00:00.879) 0:02:33.881 *****
2025-09-23 19:18:28.530930 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})
2025-09-23 19:18:28.530941 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})
2025-09-23 19:18:28.530950 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})
2025-09-23 19:18:28.530966 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})
2025-09-23 19:18:28.530975 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})
2025-09-23 19:18:28.530986 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})
2025-09-23 19:18:28.530998 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})
2025-09-23 19:18:28.531006 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.531014 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})
2025-09-23 19:18:28.531022 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})
2025-09-23 19:18:28.531030 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})
2025-09-23 19:18:28.531038 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})
2025-09-23 19:18:28.531046 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})
2025-09-23 19:18:28.531054 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.531062 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})
2025-09-23 19:18:28.531070 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})
2025-09-23 19:18:28.531078 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})
2025-09-23 19:18:28.531091 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.531098 | orchestrator |
2025-09-23 19:18:28.531106 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL users config] ************
2025-09-23 19:18:28.531114 | orchestrator | Tuesday 23 September 2025 19:15:27 +0000 (0:00:01.052) 0:02:34.933 *****
2025-09-23 19:18:28.531122 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.531129 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.531137 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.531145 | orchestrator |
2025-09-23 19:18:28.531152 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL rules config] ************
2025-09-23 19:18:28.531160 | orchestrator | Tuesday 23 September 2025 19:15:29 +0000 (0:00:01.316) 0:02:36.250 *****
2025-09-23 19:18:28.531168 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.531176 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.531183 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.531191 | orchestrator |
2025-09-23 19:18:28.531199 | orchestrator | TASK [include_role : influxdb] *************************************************
2025-09-23 19:18:28.531206 | orchestrator | Tuesday 23 September 2025 19:15:31 +0000 (0:00:02.187) 0:02:38.437 *****
2025-09-23 19:18:28.531214 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.531222 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.531230 |
orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.531237 | orchestrator | 2025-09-23 19:18:28.531245 | orchestrator | TASK [include_role : ironic] *************************************************** 2025-09-23 19:18:28.531253 | orchestrator | Tuesday 23 September 2025 19:15:31 +0000 (0:00:00.349) 0:02:38.787 ***** 2025-09-23 19:18:28.531260 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.531268 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.531276 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.531283 | orchestrator | 2025-09-23 19:18:28.531291 | orchestrator | TASK [include_role : keystone] ************************************************* 2025-09-23 19:18:28.531299 | orchestrator | Tuesday 23 September 2025 19:15:32 +0000 (0:00:00.548) 0:02:39.335 ***** 2025-09-23 19:18:28.531306 | orchestrator | included: keystone for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.531314 | orchestrator | 2025-09-23 19:18:28.531325 | orchestrator | TASK [haproxy-config : Copying over keystone haproxy config] ******************* 2025-09-23 19:18:28.531333 | orchestrator | Tuesday 23 September 2025 19:15:33 +0000 (0:00:00.988) 0:02:40.323 ***** 2025-09-23 19:18:28.531347 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 
'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:18:28.531356 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:18:28.531385 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:18:28.531395 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:18:28.531403 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:18:28.531419 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:18:28.531428 | orchestrator | changed: [testbed-node-2] 
=> (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:18:28.531437 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:18:28.531449 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:18:28.531457 | orchestrator | 2025-09-23 19:18:28.531465 | orchestrator | TASK [haproxy-config : Add configuration for keystone when using single external frontend] *** 2025-09-23 19:18:28.531473 | orchestrator | Tuesday 23 September 2025 19:15:37 +0000 (0:00:04.154) 0:02:44.478 ***** 2025-09-23 19:18:28.531482 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:18:28.531497 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:18:28.531506 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:18:28.531514 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.531522 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': 
'5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:18:28.531535 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:18:28.531543 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:18:28.531551 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.531567 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:18:28.531576 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:18:28.531584 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:18:28.531596 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.531604 | orchestrator | 2025-09-23 19:18:28.531612 | orchestrator | TASK [haproxy-config : Configuring firewall for keystone] 
********************** 2025-09-23 19:18:28.531620 | orchestrator | Tuesday 23 September 2025 19:15:38 +0000 (0:00:01.131) 0:02:45.609 ***** 2025-09-23 19:18:28.531628 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-23 19:18:28.531637 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-23 19:18:28.531645 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.531653 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-23 19:18:28.531661 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-23 19:18:28.531669 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.531677 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-23 19:18:28.531685 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 
'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-23 19:18:28.531693 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.531701 | orchestrator | 2025-09-23 19:18:28.531709 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL users config] *********** 2025-09-23 19:18:28.531717 | orchestrator | Tuesday 23 September 2025 19:15:39 +0000 (0:00:01.157) 0:02:46.767 ***** 2025-09-23 19:18:28.531725 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.531733 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.531741 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.531748 | orchestrator | 2025-09-23 19:18:28.531756 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL rules config] *********** 2025-09-23 19:18:28.531764 | orchestrator | Tuesday 23 September 2025 19:15:41 +0000 (0:00:01.445) 0:02:48.212 ***** 2025-09-23 19:18:28.531772 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.531779 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.531787 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.531795 | orchestrator | 2025-09-23 19:18:28.531803 | orchestrator | TASK [include_role : letsencrypt] ********************************************** 2025-09-23 19:18:28.531814 | orchestrator | Tuesday 23 September 2025 19:15:43 +0000 (0:00:01.898) 0:02:50.111 ***** 2025-09-23 19:18:28.531822 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.531830 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.531838 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.531845 | orchestrator | 2025-09-23 19:18:28.531861 | orchestrator | TASK [include_role : magnum] *************************************************** 2025-09-23 19:18:28.531870 | orchestrator | Tuesday 23 September 2025 19:15:43 +0000 (0:00:00.515) 0:02:50.627 ***** 2025-09-23 19:18:28.531878 | orchestrator | included: magnum 
for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.531885 | orchestrator | 2025-09-23 19:18:28.531893 | orchestrator | TASK [haproxy-config : Copying over magnum haproxy config] ********************* 2025-09-23 19:18:28.531901 | orchestrator | Tuesday 23 September 2025 19:15:45 +0000 (0:00:01.484) 0:02:52.112 ***** 2025-09-23 19:18:28.531909 | orchestrator | changed: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-09-23 19:18:28.531918 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port 
magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.531927 | orchestrator | changed: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-09-23 19:18:28.531936 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.531953 | orchestrator | changed: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 
'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-09-23 19:18:28.531967 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.531976 | orchestrator | 2025-09-23 19:18:28.531983 | orchestrator | TASK [haproxy-config : Add configuration for magnum when using single external frontend] *** 2025-09-23 19:18:28.531991 | orchestrator | Tuesday 23 September 2025 19:15:49 +0000 (0:00:04.728) 0:02:56.840 ***** 2025-09-23 19:18:28.532000 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 
'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-09-23 19:18:28.532008 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.532016 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.532030 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 
'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})
2025-09-23 19:18:28.532047 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532055 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.532063 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})
2025-09-23 19:18:28.532072 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532080 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.532087 | orchestrator |
2025-09-23 19:18:28.532095 | orchestrator | TASK [haproxy-config : Configuring firewall for magnum] ************************
2025-09-23 19:18:28.532103 | orchestrator | Tuesday 23 September 2025 19:15:50 +0000 (0:00:00.887) 0:02:57.728 *****
2025-09-23 19:18:28.532111 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})
2025-09-23 19:18:28.532119 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})
2025-09-23 19:18:28.532127 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.532136 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})
2025-09-23 19:18:28.532144 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})
2025-09-23 19:18:28.532156 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.532164 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})
2025-09-23 19:18:28.532173 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})
2025-09-23 19:18:28.532181 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.532188 | orchestrator |
2025-09-23 19:18:28.532196 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL users config] *************
2025-09-23 19:18:28.532207 | orchestrator | Tuesday 23 September 2025 19:15:51 +0000 (0:00:00.921) 0:02:58.649 *****
2025-09-23 19:18:28.532215 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.532223 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.532231 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.532238 | orchestrator |
2025-09-23 19:18:28.532250 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL rules config] *************
2025-09-23 19:18:28.532258 | orchestrator | Tuesday 23 September 2025 19:15:52 +0000 (0:00:01.113) 0:02:59.763 *****
2025-09-23 19:18:28.532265 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.532273 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.532281 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.532288 | orchestrator |
2025-09-23 19:18:28.532296 | orchestrator | TASK [include_role : manila] ***************************************************
2025-09-23 19:18:28.532304 | orchestrator | Tuesday 23 September 2025 19:15:54 +0000 (0:00:01.684) 0:03:01.447 *****
2025-09-23 19:18:28.532312 | orchestrator | included: manila for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.532319 | orchestrator |
2025-09-23 19:18:28.532327 | orchestrator | TASK [haproxy-config : Copying over manila haproxy config] *********************
2025-09-23 19:18:28.532335 | orchestrator | Tuesday 23 September 2025 19:15:55 +0000 (0:00:01.104) 0:03:02.552 *****
2025-09-23 19:18:28.532343 | orchestrator | changed: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})
2025-09-23 19:18:28.532351 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532360 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532387 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532403 | orchestrator | changed: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})
2025-09-23 19:18:28.532412 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532420 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532429 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532437 | orchestrator | changed: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})
2025-09-23 19:18:28.532450 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532462 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532475 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532483 | orchestrator |
2025-09-23 19:18:28.532491 | orchestrator | TASK [haproxy-config : Add configuration for manila when using single external frontend] ***
2025-09-23 19:18:28.532499 | orchestrator | Tuesday 23 September 2025 19:15:59 +0000 (0:00:03.600) 0:03:06.152 *****
2025-09-23 19:18:28.532507 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})
2025-09-23 19:18:28.532515 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532528 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532536 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532544 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.532560 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})
2025-09-23 19:18:28.532569 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532577 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532585 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532598 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.532606 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})
2025-09-23 19:18:28.532614 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532626 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532739 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.532751 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.532760 | orchestrator |
2025-09-23 19:18:28.532768 | orchestrator | TASK [haproxy-config : Configuring firewall for manila] ************************
2025-09-23 19:18:28.532775 | orchestrator | Tuesday 23 September 2025 19:15:59 +0000 (0:00:00.599) 0:03:06.752 *****
2025-09-23 19:18:28.532783 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})
2025-09-23 19:18:28.532791 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})
2025-09-23 19:18:28.532799 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.532807 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})
2025-09-23 19:18:28.532820 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})
2025-09-23 19:18:28.532828 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.532835 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})
2025-09-23 19:18:28.532843 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})
2025-09-23 19:18:28.532851 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.532859 | orchestrator |
2025-09-23 19:18:28.532867 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL users config] *************
2025-09-23 19:18:28.532875 | orchestrator | Tuesday 23 September 2025 19:16:01 +0000 (0:00:01.217) 0:03:07.970 *****
2025-09-23 19:18:28.532883 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.532890 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.532898 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.532905 | orchestrator |
2025-09-23 19:18:28.532913 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL rules config] *************
2025-09-23 19:18:28.532921 | orchestrator | Tuesday 23 September 2025 19:16:02 +0000 (0:00:01.256) 0:03:09.226 *****
2025-09-23 19:18:28.532928 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.532936 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.532944 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.532951 | orchestrator |
2025-09-23 19:18:28.532959 | orchestrator | TASK [include_role : mariadb] **************************************************
2025-09-23 19:18:28.532967 | orchestrator | Tuesday 23 September 2025 19:16:04 +0000 (0:00:01.917) 0:03:11.144 *****
2025-09-23 19:18:28.532975 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.532982 | orchestrator |
2025-09-23 19:18:28.532990 | orchestrator | TASK [mariadb : Ensure mysql monitor user exist] *******************************
2025-09-23 19:18:28.532998 | orchestrator | Tuesday 23 September 2025 19:16:05 +0000 (0:00:01.363) 0:03:12.507 *****
2025-09-23 19:18:28.533005 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0)
2025-09-23 19:18:28.533013 | orchestrator |
2025-09-23 19:18:28.533021 | orchestrator | TASK [haproxy-config : Copying over mariadb haproxy config] ********************
2025-09-23 19:18:28.533029 | orchestrator | Tuesday 23 September 2025 19:16:07 +0000 (0:00:01.457) 0:03:13.964 *****
2025-09-23 19:18:28.533047 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2025-09-23 19:18:28.533064 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})
2025-09-23 19:18:28.533072 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.533080 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2025-09-23 19:18:28.533095 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})
2025-09-23 19:18:28.533110 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2025-09-23 19:18:28.533124 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.533132 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})
2025-09-23 19:18:28.533140 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.533148 | orchestrator |
2025-09-23 19:18:28.533156 | orchestrator | TASK [haproxy-config : Add configuration for mariadb when using single external frontend] ***
2025-09-23 19:18:28.533163 | orchestrator | Tuesday 23 September 2025 19:16:09 +0000 (0:00:02.703) 0:03:16.668 *****
2025-09-23 19:18:28.533180 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2025-09-23 19:18:28.533194 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})
2025-09-23 19:18:28.533202 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.533210 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2025-09-23 19:18:28.533219 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})
2025-09-23 19:18:28.533227 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.533253 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor',
'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-23 19:18:28.533267 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-09-23 19:18:28.533276 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533283 | orchestrator | 2025-09-23 19:18:28.533291 | orchestrator | TASK 
[haproxy-config : Configuring firewall for mariadb] *********************** 2025-09-23 19:18:28.533299 | orchestrator | Tuesday 23 September 2025 19:16:12 +0000 (0:00:02.352) 0:03:19.020 ***** 2025-09-23 19:18:28.533307 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-23 19:18:28.533315 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-23 19:18:28.533324 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.533332 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 
3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-23 19:18:28.533348 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-23 19:18:28.533361 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.533443 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-23 19:18:28.533453 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' 
server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-23 19:18:28.533462 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533471 | orchestrator | 2025-09-23 19:18:28.533480 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL users config] ************ 2025-09-23 19:18:28.533489 | orchestrator | Tuesday 23 September 2025 19:16:14 +0000 (0:00:02.720) 0:03:21.741 ***** 2025-09-23 19:18:28.533498 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.533506 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.533515 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.533524 | orchestrator | 2025-09-23 19:18:28.533533 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL rules config] ************ 2025-09-23 19:18:28.533542 | orchestrator | Tuesday 23 September 2025 19:16:16 +0000 (0:00:01.911) 0:03:23.652 ***** 2025-09-23 19:18:28.533551 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.533560 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.533568 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533575 | orchestrator | 2025-09-23 19:18:28.533582 | orchestrator | TASK [include_role : masakari] ************************************************* 2025-09-23 19:18:28.533590 | orchestrator | Tuesday 23 September 2025 19:16:18 +0000 (0:00:01.392) 0:03:25.045 ***** 2025-09-23 19:18:28.533597 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.533605 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.533612 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533619 | orchestrator | 2025-09-23 19:18:28.533627 | orchestrator | TASK [include_role : memcached] ************************************************ 2025-09-23 19:18:28.533634 | orchestrator | Tuesday 23 September 2025 19:16:18 +0000 (0:00:00.308) 0:03:25.353 ***** 2025-09-23 19:18:28.533642 | orchestrator | 
included: memcached for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.533649 | orchestrator | 2025-09-23 19:18:28.533656 | orchestrator | TASK [haproxy-config : Copying over memcached haproxy config] ****************** 2025-09-23 19:18:28.533664 | orchestrator | Tuesday 23 September 2025 19:16:19 +0000 (0:00:01.285) 0:03:26.639 ***** 2025-09-23 19:18:28.533672 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-09-23 19:18:28.533692 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-09-23 19:18:28.533702 | orchestrator | changed: [testbed-node-1] => 
(item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-09-23 19:18:28.533709 | orchestrator | 2025-09-23 19:18:28.533716 | orchestrator | TASK [haproxy-config : Add configuration for memcached when using single external frontend] *** 2025-09-23 19:18:28.533722 | orchestrator | Tuesday 23 September 2025 19:16:21 +0000 (0:00:01.519) 0:03:28.159 ***** 2025-09-23 19:18:28.533729 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-09-23 19:18:28.533736 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': 
True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-09-23 19:18:28.533743 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.533750 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.533761 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-09-23 19:18:28.533768 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533774 | orchestrator | 2025-09-23 19:18:28.533781 | orchestrator | TASK [haproxy-config : Configuring firewall for memcached] ********************* 2025-09-23 19:18:28.533788 | orchestrator | Tuesday 23 September 2025 19:16:21 +0000 (0:00:00.387) 0:03:28.546 ***** 2025-09-23 19:18:28.533797 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 
'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-09-23 19:18:28.533804 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.533814 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-09-23 19:18:28.533821 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.533828 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-09-23 19:18:28.533835 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533841 | orchestrator | 2025-09-23 19:18:28.533848 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL users config] ********** 2025-09-23 19:18:28.533854 | orchestrator | Tuesday 23 September 2025 19:16:22 +0000 (0:00:00.589) 0:03:29.135 ***** 2025-09-23 19:18:28.533861 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.533867 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.533874 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533881 | orchestrator | 2025-09-23 19:18:28.533887 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL rules config] ********** 2025-09-23 19:18:28.533894 | orchestrator | Tuesday 23 September 2025 19:16:22 +0000 (0:00:00.756) 0:03:29.892 ***** 2025-09-23 19:18:28.533900 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.533907 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.533913 | 
orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533920 | orchestrator | 2025-09-23 19:18:28.533926 | orchestrator | TASK [include_role : mistral] ************************************************** 2025-09-23 19:18:28.533933 | orchestrator | Tuesday 23 September 2025 19:16:24 +0000 (0:00:01.248) 0:03:31.140 ***** 2025-09-23 19:18:28.533940 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.533946 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.533953 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.533959 | orchestrator | 2025-09-23 19:18:28.533966 | orchestrator | TASK [include_role : neutron] ************************************************** 2025-09-23 19:18:28.533973 | orchestrator | Tuesday 23 September 2025 19:16:24 +0000 (0:00:00.295) 0:03:31.436 ***** 2025-09-23 19:18:28.533979 | orchestrator | included: neutron for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.533986 | orchestrator | 2025-09-23 19:18:28.533992 | orchestrator | TASK [haproxy-config : Copying over neutron haproxy config] ******************** 2025-09-23 19:18:28.534003 | orchestrator | Tuesday 23 September 2025 19:16:25 +0000 (0:00:01.434) 0:03:32.870 ***** 2025-09-23 19:18:28.534010 | orchestrator | changed: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 
'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-09-23 19:18:28.534132 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.534193 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.534204 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': 
False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.534212 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-09-23 19:18:28.534224 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-09-23 
19:18:28.534231 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-23 19:18:28.534239 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-23 19:18:28.534247 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.534304 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 
'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-23 19:18:28.534315 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.534322 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-23 19:18:28.534334 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': 
True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534341 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534349 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})
2025-09-23 19:18:28.534417 | orchestrator | changed: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})
2025-09-23 19:18:28.534428 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})
2025-09-23 19:18:28.534440 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534447 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534454 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534508 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})
2025-09-23 19:18:28.534518 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534525 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534537 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534544 | orchestrator | changed: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})
2025-09-23 19:18:28.534552 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534562 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})
2025-09-23 19:18:28.534616 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534627 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534638 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534645 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534652 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534659 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534713 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})
2025-09-23 19:18:28.534723 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534734 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534741 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})
2025-09-23 19:18:28.534748 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534755 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})
2025-09-23 19:18:28.534809 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534819 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534831 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})
2025-09-23 19:18:28.534839 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534846 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534853 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.534863 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.534914 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})
2025-09-23 19:18:28.534929 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})
2025-09-23 19:18:28.534936 | orchestrator |
2025-09-23 19:18:28.534943 | orchestrator | TASK [haproxy-config : Add configuration for neutron when using single external frontend] ***
2025-09-23 19:18:28.534950 | orchestrator | Tuesday 23 September 2025 19:16:30 +0000 (0:00:04.212) 0:03:37.083 *****
2025-09-23 19:18:28.534957 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})
2025-09-23 19:18:28.534964 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535017 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535027 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535043 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})
2025-09-23 19:18:28.535050 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535057 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.535064 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.535071 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535122 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})
2025-09-23 19:18:28.535141 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})
2025-09-23 19:18:28.535148 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535155 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535162 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})
2025-09-23 19:18:28.535172 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535227 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})
2025-09-23 19:18:28.535237 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535244 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535251 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})
2025-09-23 19:18:28.535258 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})
2025-09-23 19:18:28.535311 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.535326 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck':
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-23 19:18:28.535333 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-23 19:18:28.535340 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.535348 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-23 19:18:28.535355 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': 
{'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-09-23 19:18:28.535362 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535388 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535445 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': 
['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-23 19:18:28.535456 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535463 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535470 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535477 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-23 19:18:28.535511 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
"healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-09-23 19:18:28.535519 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-23 19:18:28.535526 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535533 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535540 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-23 19:18:28.535548 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-09-23 19:18:28.535558 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-23 19:18:28.535588 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535596 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-23 19:18:28.535603 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-23 19:18:28.535610 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.535617 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535624 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-23 19:18:28.535636 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  
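The `healthcheck` dicts repeated throughout the output above (keys `interval`, `retries`, `start_period`, `test`, `timeout`) mirror Docker's HEALTHCHECK options. As a minimal illustrative sketch — the helper name and flag mapping are assumptions for illustration, not kolla-ansible code — such a dict could be rendered into `docker run --health-*` flags like this, assuming the durations are seconds and `test` is a `['CMD-SHELL', '<command>']` pair as seen in the log:

```python
# Sketch: render a Kolla-style healthcheck dict into docker run flags.
# The dict shape matches the log entries above; healthcheck_flags() is a
# hypothetical helper, not part of kolla-ansible.

def healthcheck_flags(hc):
    """Map a healthcheck dict to docker run --health-* flags.

    Assumes 'interval', 'start_period' and 'timeout' are seconds and
    'test' is a ['CMD-SHELL', '<command>'] pair.
    """
    kind, cmd = hc['test']  # e.g. ['CMD-SHELL', 'healthcheck_port ... 5672']
    assert kind == 'CMD-SHELL'
    return [
        f"--health-cmd={cmd}",
        f"--health-interval={hc['interval']}s",
        f"--health-retries={hc['retries']}",
        f"--health-start-period={hc['start_period']}s",
        f"--health-timeout={hc['timeout']}s",
    ]

# One of the healthcheck dicts from the log above:
flags = healthcheck_flags({
    'interval': '30', 'retries': '3', 'start_period': '5',
    'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'],
    'timeout': '30',
})
```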
2025-09-23 19:18:28.535662 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.535671 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-09-23 19:18:28.535678 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 
'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-23 19:18:28.535685 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.535692 | orchestrator | 2025-09-23 19:18:28.535699 | orchestrator | TASK [haproxy-config : Configuring firewall for neutron] *********************** 2025-09-23 19:18:28.535706 | orchestrator | Tuesday 23 September 2025 19:16:31 +0000 (0:00:01.474) 0:03:38.558 ***** 2025-09-23 19:18:28.535713 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-09-23 19:18:28.535720 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-09-23 19:18:28.535727 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.535733 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-09-23 19:18:28.535740 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-09-23 19:18:28.535753 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.535760 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': 
'9696'}})  2025-09-23 19:18:28.535766 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-09-23 19:18:28.535773 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.535780 | orchestrator | 2025-09-23 19:18:28.535786 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL users config] ************ 2025-09-23 19:18:28.535793 | orchestrator | Tuesday 23 September 2025 19:16:33 +0000 (0:00:01.972) 0:03:40.531 ***** 2025-09-23 19:18:28.535800 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.535806 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.535813 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.535820 | orchestrator | 2025-09-23 19:18:28.535826 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL rules config] ************ 2025-09-23 19:18:28.535833 | orchestrator | Tuesday 23 September 2025 19:16:35 +0000 (0:00:01.417) 0:03:41.949 ***** 2025-09-23 19:18:28.535840 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.535846 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.535853 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.535860 | orchestrator | 2025-09-23 19:18:28.535866 | orchestrator | TASK [include_role : placement] ************************************************ 2025-09-23 19:18:28.535873 | orchestrator | Tuesday 23 September 2025 19:16:37 +0000 (0:00:02.148) 0:03:44.098 ***** 2025-09-23 19:18:28.535883 | orchestrator | included: placement for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.535890 | orchestrator | 2025-09-23 19:18:28.535896 | orchestrator | TASK [haproxy-config : Copying over placement haproxy config] ****************** 2025-09-23 19:18:28.535903 | orchestrator | Tuesday 23 September 2025 19:16:38 +0000 
(0:00:01.172) 0:03:45.270 ***** 2025-09-23 19:18:28.535928 | orchestrator | changed: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.535936 | orchestrator | changed: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.535944 | orchestrator | changed: [testbed-node-1] => (item={'key': 'placement-api', 
'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.535956 | orchestrator | 2025-09-23 19:18:28.535963 | orchestrator | TASK [haproxy-config : Add configuration for placement when using single external frontend] *** 2025-09-23 19:18:28.535970 | orchestrator | Tuesday 23 September 2025 19:16:42 +0000 (0:00:03.835) 0:03:49.106 ***** 2025-09-23 19:18:28.535977 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 
'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.535984 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.536011 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.536019 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.536026 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.536033 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.536044 | orchestrator |
2025-09-23 19:18:28.536051 | orchestrator | TASK [haproxy-config : Configuring firewall for placement] *********************
2025-09-23 19:18:28.536058 | orchestrator | Tuesday 23 September 2025 19:16:42 +0000 (0:00:00.518) 0:03:49.624 *****
2025-09-23 19:18:28.536065 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536072 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536079 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.536085 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536092 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536099 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.536106 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536112 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536119 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.536126 | orchestrator |
2025-09-23 19:18:28.536132 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL users config] **********
2025-09-23 19:18:28.536139 | orchestrator | Tuesday 23 September 2025 19:16:43 +0000 (0:00:00.756) 0:03:50.381 *****
2025-09-23 19:18:28.536146 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.536152 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.536158 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.536165 | orchestrator |
2025-09-23 19:18:28.536171 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL rules config] **********
2025-09-23 19:18:28.536178 | orchestrator | Tuesday 23 September 2025 19:16:44 +0000 (0:00:01.348) 0:03:51.729 *****
2025-09-23 19:18:28.536184 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.536191 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.536198 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.536204 | orchestrator |
2025-09-23 19:18:28.536211 | orchestrator | TASK [include_role : nova] *****************************************************
2025-09-23 19:18:28.536220 | orchestrator | Tuesday 23 September 2025 19:16:47 +0000 (0:00:02.234) 0:03:53.963 *****
2025-09-23 19:18:28.536227 | orchestrator | included: nova for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.536234 | orchestrator |
2025-09-23 19:18:28.536240 | orchestrator | TASK [haproxy-config : Copying over nova haproxy config] ***********************
2025-09-23 19:18:28.536265 | orchestrator | Tuesday 23 September 2025 19:16:48 +0000 (0:00:01.466) 0:03:55.430 *****
2025-09-23 19:18:28.536274 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.536286 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536294 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536301 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.536328 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536337 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.536348 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536355 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536401 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536411 | orchestrator |
2025-09-23 19:18:28.536418 | orchestrator | TASK [haproxy-config : Add configuration for nova when using single external frontend] ***
2025-09-23 19:18:28.536425 | orchestrator | Tuesday 23 September 2025 19:16:52 +0000 (0:00:04.140) 0:03:59.571 *****
2025-09-23 19:18:28.536454 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.536468 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536475 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536482 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.536502 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.536510 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536539 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536552 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.536560 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})
2025-09-23 19:18:28.536567 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536574 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})
2025-09-23 19:18:28.536581 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.536588 | orchestrator |
2025-09-23 19:18:28.536595 | orchestrator | TASK [haproxy-config : Configuring firewall for nova] **************************
2025-09-23 19:18:28.536601 | orchestrator | Tuesday 23 September 2025 19:16:53 +0000 (0:00:00.918) 0:04:00.490 *****
2025-09-23 19:18:28.536608 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536615 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536623 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536629 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536636 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.536643 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536676 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536685 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536692 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536699 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536705 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.536711 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536717 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536724 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})
2025-09-23 19:18:28.536730 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.536736 | orchestrator |
2025-09-23 19:18:28.536743 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL users config] ***************
2025-09-23 19:18:28.536749 | orchestrator | Tuesday 23 September 2025 19:16:54 +0000 (0:00:01.188) 0:04:01.678 *****
2025-09-23 19:18:28.536755 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.536761 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.536767 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.536774 | orchestrator |
2025-09-23 19:18:28.536780 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL rules config] ***************
2025-09-23 19:18:28.536786 | orchestrator | Tuesday 23 September 2025 19:16:56 +0000 (0:00:01.418) 0:04:03.097 *****
2025-09-23 19:18:28.536792 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.536798 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.536804 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.536810 | orchestrator |
2025-09-23 19:18:28.536817 | orchestrator | TASK [include_role : nova-cell] ************************************************
2025-09-23 19:18:28.536823 | orchestrator | Tuesday 23 September 2025 19:16:58 +0000 (0:00:02.098) 0:04:05.196 *****
2025-09-23 19:18:28.536829 | orchestrator | included: nova-cell for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.536835 | orchestrator |
2025-09-23 19:18:28.536841 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-novncproxy] ******************
2025-09-23 19:18:28.536847 | orchestrator | Tuesday 23 September 2025 19:16:59 +0000 (0:00:01.497) 0:04:06.694 *****
2025-09-23 19:18:28.536854 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-novncproxy)
2025-09-23 19:18:28.536860 | orchestrator |
2025-09-23 19:18:28.536866 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config] ***
2025-09-23 19:18:28.536872 | orchestrator | Tuesday 23 September 2025 19:17:00 +0000 (0:00:00.828) 0:04:07.522 *****
2025-09-23 19:18:28.536879 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.536894 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.536922 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.536930 | orchestrator |
2025-09-23 19:18:28.536936 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-novncproxy when using single external frontend] ***
2025-09-23 19:18:28.536942 | orchestrator | Tuesday 23 September 2025 19:17:04 +0000 (0:00:04.229) 0:04:11.751 *****
2025-09-23 19:18:28.536948 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.536955 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.536961 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.536968 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.536974 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.536980 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.536987 | orchestrator |
2025-09-23 19:18:28.536993 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-novncproxy] *****
2025-09-23 19:18:28.536999 | orchestrator | Tuesday 23 September 2025 19:17:06 +0000 (0:00:01.487) 0:04:13.239 *****
2025-09-23 19:18:28.537005 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2025-09-23 19:18:28.537012 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2025-09-23 19:18:28.537023 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.537029 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2025-09-23 19:18:28.537036 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2025-09-23 19:18:28.537042 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.537049 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2025-09-23 19:18:28.537055 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})
2025-09-23 19:18:28.537061 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.537068 | orchestrator |
2025-09-23 19:18:28.537074 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] **********
2025-09-23 19:18:28.537080 | orchestrator | Tuesday 23 September 2025 19:17:07 +0000 (0:00:01.532) 0:04:14.771 *****
2025-09-23 19:18:28.537086 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.537095 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.537102 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.537108 | orchestrator |
2025-09-23 19:18:28.537114 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] **********
2025-09-23 19:18:28.537137 | orchestrator | Tuesday 23 September 2025 19:17:10 +0000 (0:00:02.646) 0:04:17.418 *****
2025-09-23 19:18:28.537144 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:18:28.537150 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:18:28.537157 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:18:28.537162 | orchestrator |
2025-09-23 19:18:28.537169 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-spicehtml5proxy] *************
2025-09-23 19:18:28.537175 | orchestrator | Tuesday 23 September 2025 19:17:13 +0000 (0:00:03.020) 0:04:20.438 *****
2025-09-23 19:18:28.537181 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-spicehtml5proxy)
2025-09-23 19:18:28.537187 | orchestrator |
2025-09-23 19:18:28.537194 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-spicehtml5proxy haproxy config] ***
2025-09-23 19:18:28.537200 | orchestrator | Tuesday 23 September 2025 19:17:14 +0000 (0:00:01.388) 0:04:21.827 *****
2025-09-23 19:18:28.537206 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.537213 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.537219 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.537230 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.537236 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.537243 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.537249 | orchestrator |
2025-09-23 19:18:28.537255 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-spicehtml5proxy when using single external frontend] ***
2025-09-23 19:18:28.537261 | orchestrator | Tuesday 23 September 2025 19:17:16 +0000 (0:00:01.309) 0:04:23.137 *****
2025-09-23 19:18:28.537268 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.537274 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.537280 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.537286 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.537312 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})
2025-09-23 19:18:28.537320 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.537326 | orchestrator |
2025-09-23 19:18:28.537332 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-spicehtml5proxy] ***
2025-09-23 19:18:28.537339 | orchestrator | Tuesday 23 September 2025 19:17:17 +0000 (0:00:01.394) 0:04:24.531 *****
2025-09-23 19:18:28.537345 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.537351 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.537357 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.537378 | orchestrator |
2025-09-23 19:18:28.537384 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] **********
2025-09-23 19:18:28.537391 | orchestrator | Tuesday 23 September 2025 19:17:19 +0000 (0:00:01.801) 0:04:26.332 *****
2025-09-23 19:18:28.537397 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:18:28.537403 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:18:28.537409 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:18:28.537415 | orchestrator |
2025-09-23 19:18:28.537421 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] **********
2025-09-23 19:18:28.537432 | orchestrator | Tuesday 23 September 2025 19:17:21 +0000 (0:00:02.324) 0:04:28.656 *****
2025-09-23 19:18:28.537438 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:18:28.537444 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:18:28.537450 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:18:28.537456 | orchestrator |
2025-09-23 19:18:28.537463 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-serialproxy] *****************
2025-09-23 19:18:28.537469 | orchestrator | Tuesday 23 September 2025 19:17:24 +0000 (0:00:02.918) 0:04:31.575 *****
2025-09-23 19:18:28.537475 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-serialproxy)
2025-09-23 19:18:28.537481 | orchestrator |
2025-09-23 19:18:28.537488 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-serialproxy haproxy config] ***
2025-09-23 19:18:28.537494 | orchestrator | Tuesday 23 September 2025 19:17:25 +0000 (0:00:00.842) 0:04:32.418 *****
2025-09-23 19:18:28.537500 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083',
'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-23 19:18:28.537507 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.537513 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-23 19:18:28.537519 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.537526 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-23 19:18:28.537532 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.537538 | orchestrator | 2025-09-23 19:18:28.537544 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-serialproxy when using single external frontend] *** 2025-09-23 19:18:28.537551 | orchestrator | Tuesday 23 September 2025 19:17:26 +0000 (0:00:01.284) 0:04:33.702 ***** 2025-09-23 19:18:28.537557 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 
'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-23 19:18:28.537567 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.537592 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-23 19:18:28.537604 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.537610 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-23 19:18:28.537617 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.537623 | orchestrator | 2025-09-23 19:18:28.537629 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-serialproxy] **** 2025-09-23 19:18:28.537635 | orchestrator | Tuesday 23 September 2025 19:17:28 +0000 (0:00:01.274) 0:04:34.977 ***** 2025-09-23 19:18:28.537641 | orchestrator | skipping: 
[testbed-node-0] 2025-09-23 19:18:28.537647 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.537653 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.537660 | orchestrator | 2025-09-23 19:18:28.537666 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-09-23 19:18:28.537672 | orchestrator | Tuesday 23 September 2025 19:17:29 +0000 (0:00:01.541) 0:04:36.519 ***** 2025-09-23 19:18:28.537678 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:18:28.537684 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:18:28.537691 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:18:28.537697 | orchestrator | 2025-09-23 19:18:28.537703 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-09-23 19:18:28.537709 | orchestrator | Tuesday 23 September 2025 19:17:31 +0000 (0:00:02.362) 0:04:38.881 ***** 2025-09-23 19:18:28.537715 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:18:28.537722 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:18:28.537728 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:18:28.537734 | orchestrator | 2025-09-23 19:18:28.537740 | orchestrator | TASK [include_role : octavia] ************************************************** 2025-09-23 19:18:28.537746 | orchestrator | Tuesday 23 September 2025 19:17:35 +0000 (0:00:03.414) 0:04:42.296 ***** 2025-09-23 19:18:28.537752 | orchestrator | included: octavia for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.537758 | orchestrator | 2025-09-23 19:18:28.537765 | orchestrator | TASK [haproxy-config : Copying over octavia haproxy config] ******************** 2025-09-23 19:18:28.537771 | orchestrator | Tuesday 23 September 2025 19:17:36 +0000 (0:00:01.469) 0:04:43.765 ***** 2025-09-23 19:18:28.537777 | orchestrator | changed: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 
'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.537784 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-23 19:18:28.537814 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.537822 
| orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.537829 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.537835 | orchestrator | changed: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 
'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.537842 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-23 19:18:28.537848 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.537861 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port 
octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.537885 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.537893 | orchestrator | changed: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.537899 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-23 19:18:28.537906 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.537913 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.537923 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.537929 | orchestrator | 2025-09-23 19:18:28.537936 | orchestrator | TASK [haproxy-config : Add configuration for octavia when using single external frontend] *** 2025-09-23 19:18:28.537945 | orchestrator | Tuesday 23 September 2025 19:17:40 +0000 (0:00:03.269) 0:04:47.035 ***** 2025-09-23 19:18:28.537969 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.537977 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-23 19:18:28.537983 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.537990 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.537997 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.538009 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.538038 | orchestrator | skipping: [testbed-node-1] => (item={'key': 
'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.538064 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-23 19:18:28.538071 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.538080 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.538086 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.538097 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': 
['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-23 19:18:28.538104 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.538110 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.538139 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.538147 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-23 19:18:28.538154 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-23 19:18:28.538160 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.538166 | orchestrator | 2025-09-23 19:18:28.538173 | orchestrator | TASK [haproxy-config : Configuring firewall for octavia] *********************** 2025-09-23 19:18:28.538179 | orchestrator | Tuesday 23 September 2025 19:17:40 +0000 (0:00:00.679) 0:04:47.715 ***** 2025-09-23 19:18:28.538185 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-23 19:18:28.538192 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-23 19:18:28.538198 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.538208 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': 
'9876', 'tls_backend': 'no'}})  2025-09-23 19:18:28.538215 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-23 19:18:28.538221 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.538227 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-23 19:18:28.538234 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-23 19:18:28.538240 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.538246 | orchestrator | 2025-09-23 19:18:28.538252 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL users config] ************ 2025-09-23 19:18:28.538258 | orchestrator | Tuesday 23 September 2025 19:17:42 +0000 (0:00:01.420) 0:04:49.136 ***** 2025-09-23 19:18:28.538265 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.538271 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.538277 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.538283 | orchestrator | 2025-09-23 19:18:28.538289 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL rules config] ************ 2025-09-23 19:18:28.538295 | orchestrator | Tuesday 23 September 2025 19:17:43 +0000 (0:00:01.597) 0:04:50.733 ***** 2025-09-23 19:18:28.538301 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.538307 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.538314 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.538320 | orchestrator | 
2025-09-23 19:18:28.538326 | orchestrator | TASK [include_role : opensearch] *********************************************** 2025-09-23 19:18:28.538332 | orchestrator | Tuesday 23 September 2025 19:17:45 +0000 (0:00:02.106) 0:04:52.840 ***** 2025-09-23 19:18:28.538342 | orchestrator | included: opensearch for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.538348 | orchestrator | 2025-09-23 19:18:28.538354 | orchestrator | TASK [haproxy-config : Copying over opensearch haproxy config] ***************** 2025-09-23 19:18:28.538360 | orchestrator | Tuesday 23 September 2025 19:17:47 +0000 (0:00:01.392) 0:04:54.232 ***** 2025-09-23 19:18:28.538420 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:18:28.538429 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:18:28.538441 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:18:28.538448 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': 
{'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:18:28.538475 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:18:28.538484 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:18:28.538496 | orchestrator | 2025-09-23 19:18:28.538502 | orchestrator | TASK [haproxy-config : Add configuration for opensearch when using single external frontend] *** 2025-09-23 19:18:28.538509 | orchestrator | Tuesday 23 September 2025 19:17:52 +0000 (0:00:05.324) 0:04:59.557 ***** 2025-09-23 19:18:28.538515 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:18:28.538522 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 
'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:18:28.538529 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.538555 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:18:28.538564 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': 
{'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:18:28.538574 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.538580 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:18:28.538586 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 
'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:18:28.538592 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.538598 | orchestrator | 2025-09-23 19:18:28.538603 | orchestrator | TASK [haproxy-config : Configuring firewall for opensearch] ******************** 2025-09-23 19:18:28.538609 | orchestrator | Tuesday 23 September 2025 19:17:53 +0000 (0:00:00.700) 0:05:00.258 ***** 2025-09-23 19:18:28.538614 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-09-23 19:18:28.538623 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-23 19:18:28.538643 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 
'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-23 19:18:28.538650 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.538656 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-09-23 19:18:28.538661 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-23 19:18:28.538667 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-23 19:18:28.538676 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.538682 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-09-23 19:18:28.538688 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-23 19:18:28.538693 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-23 19:18:28.538699 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.538704 | orchestrator | 2025-09-23 19:18:28.538710 | orchestrator | TASK [proxysql-config : Copying over opensearch 
ProxySQL users config] ********* 2025-09-23 19:18:28.538715 | orchestrator | Tuesday 23 September 2025 19:17:54 +0000 (0:00:00.910) 0:05:01.169 ***** 2025-09-23 19:18:28.538721 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.538726 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.538731 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.538737 | orchestrator | 2025-09-23 19:18:28.538742 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL rules config] ********* 2025-09-23 19:18:28.538748 | orchestrator | Tuesday 23 September 2025 19:17:55 +0000 (0:00:00.782) 0:05:01.951 ***** 2025-09-23 19:18:28.538753 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.538758 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.538764 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.538769 | orchestrator | 2025-09-23 19:18:28.538774 | orchestrator | TASK [include_role : prometheus] *********************************************** 2025-09-23 19:18:28.538780 | orchestrator | Tuesday 23 September 2025 19:17:56 +0000 (0:00:01.350) 0:05:03.301 ***** 2025-09-23 19:18:28.538785 | orchestrator | included: prometheus for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.538791 | orchestrator | 2025-09-23 19:18:28.538796 | orchestrator | TASK [haproxy-config : Copying over prometheus haproxy config] ***************** 2025-09-23 19:18:28.538802 | orchestrator | Tuesday 23 September 2025 19:17:57 +0000 (0:00:01.397) 0:05:04.699 ***** 2025-09-23 19:18:28.538807 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-23 19:18:28.538816 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-23 19:18:28.538839 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-23 19:18:28.538849 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:18:28.538856 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:18:28.538861 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-23 19:18:28.538867 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-23 19:18:28.538873 | orchestrator | skipping: 
[testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:18:28.538879 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:18:28.538892 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-23 19:18:28.538902 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-23 19:18:28.538908 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-23 19:18:28.538914 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:18:28.538920 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 
19:18:28.538926 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-23 19:18:28.538934 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-23 19:18:28.538948 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': 
['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-09-23 19:18:28.538954 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:18:28.538959 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:18:28.538965 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.538971 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})
2025-09-23 19:18:28.538980 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})
2025-09-23 19:18:28.538992 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.538998 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539004 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.539009 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})
2025-09-23 19:18:28.539015 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})
2025-09-23 19:18:28.539025 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539036 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539042 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.539048 | orchestrator |
2025-09-23 19:18:28.539053 | orchestrator | TASK [haproxy-config : Add configuration for prometheus when using single external frontend] ***
2025-09-23 19:18:28.539059 | orchestrator | Tuesday 23 September 2025 19:18:02 +0000 (0:00:04.508) 0:05:09.207 *****
2025-09-23 19:18:28.539064 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})
2025-09-23 19:18:28.539070 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:18:28.539076 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539081 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539090 | orchestrator |
skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.539101 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})
2025-09-23 19:18:28.539108 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})
2025-09-23 19:18:28.539113 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539119 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539125 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.539134 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.539140 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})
2025-09-23 19:18:28.539150 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:18:28.539156 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539162 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539168 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.539173 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})
2025-09-23 19:18:28.539183 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})
2025-09-23 19:18:28.539194 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})
2025-09-23 19:18:28.539200 | orchestrator |
skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:18:28.539206 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539211 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539217 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539223 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539232 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.539238 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.539243 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.539256 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})
2025-09-23 19:18:28.539263 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})
2025-09-23 19:18:28.539269 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539274 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:18:28.539284 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:18:28.539289 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.539295 | orchestrator |
2025-09-23 19:18:28.539300 | orchestrator | TASK [haproxy-config : Configuring firewall for prometheus] ********************
2025-09-23 19:18:28.539306 | orchestrator | Tuesday 23 September 2025 19:18:03 +0000 (0:00:01.223) 0:05:10.431 *****
2025-09-23 19:18:28.539311 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})
2025-09-23 19:18:28.539317 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})
2025-09-23 19:18:28.539325 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})
2025-09-23 19:18:28.539334 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})
2025-09-23 19:18:28.539340 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.539346 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})
2025-09-23 19:18:28.539351 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})
2025-09-23 19:18:28.539357 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})
2025-09-23 19:18:28.539373 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})
2025-09-23 19:18:28.539379 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.539384 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})
2025-09-23 19:18:28.539390 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})
2025-09-23 19:18:28.539399 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})
2025-09-23 19:18:28.539405 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})
2025-09-23 19:18:28.539411 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.539416 | orchestrator |
2025-09-23 19:18:28.539422 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL users config] *********
2025-09-23 19:18:28.539427 | orchestrator | Tuesday 23 September 2025 19:18:04 +0000 (0:00:00.980) 0:05:11.411 *****
2025-09-23 19:18:28.539433 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.539438 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.539443 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.539449 | orchestrator |
2025-09-23 19:18:28.539454 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL rules config] *********
2025-09-23 19:18:28.539460 | orchestrator | Tuesday 23 September 2025 19:18:04 +0000 (0:00:00.441) 0:05:11.853 *****
2025-09-23 19:18:28.539465 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.539470 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.539476 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.539481 | orchestrator |
2025-09-23 19:18:28.539487 | orchestrator | TASK [include_role : rabbitmq] *************************************************
2025-09-23 19:18:28.539492 | orchestrator | Tuesday 23 September 2025 19:18:06 +0000 (0:00:01.473) 0:05:13.327 *****
2025-09-23 19:18:28.539498 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:18:28.539503 | orchestrator |
2025-09-23 19:18:28.539508 | orchestrator | TASK [haproxy-config : Copying over rabbitmq haproxy config] *******************
2025-09-23 19:18:28.539514 | orchestrator | Tuesday 23 September 2025 19:18:08 +0000 (0:00:01.750) 0:05:15.077 *****
2025-09-23 19:18:28.539525 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:18:28.539531 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:18:28.539544 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:18:28.539550 | orchestrator |
2025-09-23 19:18:28.539556 | orchestrator | TASK [haproxy-config : Add configuration for rabbitmq when using single external frontend] ***
2025-09-23 19:18:28.539561 | orchestrator | Tuesday 23 September 2025 19:18:10 +0000 (0:00:02.590) 0:05:17.668 *****
2025-09-23 19:18:28.539567 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:18:28.539578 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:18:28.539585 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:18:28.539590 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:18:28.539596 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-23 19:18:28.539606 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:18:28.539612 | orchestrator |
2025-09-23 19:18:28.539617 | orchestrator | TASK [haproxy-config : Configuring firewall for rabbitmq] **********************
2025-09-23 19:18:28.539622 | orchestrator | Tuesday 23 September 2025 19:18:11 +0000 (0:00:00.418) 0:05:18.086
***** 2025-09-23 19:18:28.539628 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-09-23 19:18:28.539633 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.539639 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-09-23 19:18:28.539644 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.539650 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-09-23 19:18:28.539655 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.539661 | orchestrator | 2025-09-23 19:18:28.539666 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL users config] *********** 2025-09-23 19:18:28.539671 | orchestrator | Tuesday 23 September 2025 19:18:12 +0000 (0:00:00.955) 0:05:19.042 ***** 2025-09-23 19:18:28.539677 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.539682 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.539687 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.539693 | orchestrator | 2025-09-23 19:18:28.539698 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL rules config] *********** 2025-09-23 19:18:28.539704 | orchestrator | Tuesday 23 September 2025 19:18:12 +0000 (0:00:00.444) 0:05:19.487 ***** 2025-09-23 19:18:28.539709 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.539714 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.539720 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.539725 | orchestrator | 2025-09-23 19:18:28.539731 | orchestrator | TASK [include_role : skyline] ************************************************** 2025-09-23 
19:18:28.539736 | orchestrator | Tuesday 23 September 2025 19:18:13 +0000 (0:00:01.321) 0:05:20.808 ***** 2025-09-23 19:18:28.539741 | orchestrator | included: skyline for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:18:28.539747 | orchestrator | 2025-09-23 19:18:28.539752 | orchestrator | TASK [haproxy-config : Copying over skyline haproxy config] ******************** 2025-09-23 19:18:28.539758 | orchestrator | Tuesday 23 September 2025 19:18:15 +0000 (0:00:01.736) 0:05:22.544 ***** 2025-09-23 19:18:28.539763 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.539776 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': 
'5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.539825 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.539838 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 
'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.539844 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.539857 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': 
False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-09-23 19:18:28.539869 | orchestrator | 2025-09-23 19:18:28.539874 | orchestrator | TASK [haproxy-config : Add configuration for skyline when using single external frontend] *** 2025-09-23 19:18:28.539880 | orchestrator | Tuesday 23 September 2025 19:18:21 +0000 (0:00:06.072) 0:05:28.617 ***** 2025-09-23 19:18:28.539886 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.539892 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.539897 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.539903 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.539914 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': 
'30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.539924 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.539930 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.539936 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-09-23 19:18:28.539941 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.539947 | orchestrator | 2025-09-23 19:18:28.539952 | orchestrator | TASK [haproxy-config : Configuring firewall for skyline] *********************** 2025-09-23 19:18:28.539958 | orchestrator | Tuesday 23 September 2025 19:18:22 +0000 (0:00:00.655) 0:05:29.273 ***** 2025-09-23 19:18:28.539964 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-23 19:18:28.539970 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-23 19:18:28.539976 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-23 19:18:28.539981 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-23 19:18:28.539987 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:18:28.539996 | orchestrator | skipping: [testbed-node-1] 
=> (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-23 19:18:28.540001 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-23 19:18:28.540010 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-23 19:18:28.540018 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-23 19:18:28.540024 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:18:28.540030 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-23 19:18:28.540035 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-23 19:18:28.540041 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-23 19:18:28.540046 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-23 19:18:28.540052 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:18:28.540057 | orchestrator | 2025-09-23 19:18:28.540062 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL users config] ************ 2025-09-23 19:18:28.540068 | orchestrator | Tuesday 23 September 2025 19:18:23 +0000 (0:00:01.567) 0:05:30.840 ***** 2025-09-23 19:18:28.540073 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.540079 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.540084 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.540090 | orchestrator | 2025-09-23 19:18:28.540095 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL rules config] ************ 2025-09-23 19:18:28.540100 | orchestrator | Tuesday 23 September 2025 19:18:25 +0000 (0:00:01.440) 0:05:32.281 ***** 2025-09-23 19:18:28.540106 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:18:28.540111 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:18:28.540117 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:18:28.540122 | orchestrator | 2025-09-23 19:18:28.540127 | orchestrator | TASK [include_role : swift] **************************************************** 2025-09-23 19:18:28.540132 | orchestrator | Tuesday 23 September 2025 19:18:27 +0000 (0:00:02.060) 0:05:34.341 ***** 2025-09-23 19:18:28.540138 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"msg": "The conditional check 'enable_swift | bool' failed. 
The error was: error while evaluating conditional (enable_swift | bool): 'enable_swift' is undefined\n\nThe error appears to be in '/ansible/kolla-loadbalancer.yml': line 207, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n when: enable_skyline | bool\n - include_role:\n ^ here\n"} 2025-09-23 19:18:28.540144 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"msg": "The conditional check 'enable_swift | bool' failed. The error was: error while evaluating conditional (enable_swift | bool): 'enable_swift' is undefined\n\nThe error appears to be in '/ansible/kolla-loadbalancer.yml': line 207, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n when: enable_skyline | bool\n - include_role:\n ^ here\n"} 2025-09-23 19:18:28.540154 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"msg": "The conditional check 'enable_swift | bool' failed. The error was: error while evaluating conditional (enable_swift | bool): 'enable_swift' is undefined\n\nThe error appears to be in '/ansible/kolla-loadbalancer.yml': line 207, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n when: enable_skyline | bool\n - include_role:\n ^ here\n"} 2025-09-23 19:18:28.540159 | orchestrator | 2025-09-23 19:18:28.540165 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:18:28.540171 | orchestrator | testbed-node-0 : ok=111  changed=73  unreachable=0 failed=1  skipped=85  rescued=0 ignored=0 2025-09-23 19:18:28.540176 | orchestrator | testbed-node-1 : ok=110  changed=73  unreachable=0 failed=1  skipped=85  rescued=0 ignored=0 2025-09-23 19:18:28.540182 | orchestrator | testbed-node-2 : ok=110  changed=73  unreachable=0 failed=1  skipped=85  rescued=0 ignored=0 2025-09-23 19:18:28.540187 | orchestrator | 2025-09-23 
19:18:28.540193 | orchestrator | 2025-09-23 19:18:28.540198 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:18:28.540206 | orchestrator | Tuesday 23 September 2025 19:18:27 +0000 (0:00:00.240) 0:05:34.582 ***** 2025-09-23 19:18:28.540212 | orchestrator | =============================================================================== 2025-09-23 19:18:28.540220 | orchestrator | haproxy-config : Copying over skyline haproxy config -------------------- 6.07s 2025-09-23 19:18:28.540225 | orchestrator | haproxy-config : Copying over opensearch haproxy config ----------------- 5.32s 2025-09-23 19:18:28.540231 | orchestrator | haproxy-config : Copying over magnum haproxy config --------------------- 4.73s 2025-09-23 19:18:28.540236 | orchestrator | haproxy-config : Copying over glance haproxy config --------------------- 4.56s 2025-09-23 19:18:28.540242 | orchestrator | haproxy-config : Copying over prometheus haproxy config ----------------- 4.51s 2025-09-23 19:18:28.540247 | orchestrator | haproxy-config : Copying over horizon haproxy config -------------------- 4.40s 2025-09-23 19:18:28.540252 | orchestrator | haproxy-config : Copying over aodh haproxy config ----------------------- 4.31s 2025-09-23 19:18:28.540258 | orchestrator | haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config --- 4.23s 2025-09-23 19:18:28.540263 | orchestrator | haproxy-config : Copying over neutron haproxy config -------------------- 4.21s 2025-09-23 19:18:28.540269 | orchestrator | haproxy-config : Copying over keystone haproxy config ------------------- 4.15s 2025-09-23 19:18:28.540274 | orchestrator | haproxy-config : Copying over nova haproxy config ----------------------- 4.14s 2025-09-23 19:18:28.540279 | orchestrator | haproxy-config : Copying over designate haproxy config ------------------ 4.14s 2025-09-23 19:18:28.540285 | orchestrator | haproxy-config : Copying over cinder haproxy config 
--------------------- 4.10s 2025-09-23 19:18:28.540290 | orchestrator | service-cert-copy : loadbalancer | Copying over extra CA certificates --- 4.02s 2025-09-23 19:18:28.540295 | orchestrator | haproxy-config : Add configuration for glance when using single external frontend --- 3.84s 2025-09-23 19:18:28.540301 | orchestrator | haproxy-config : Copying over placement haproxy config ------------------ 3.84s 2025-09-23 19:18:28.540306 | orchestrator | haproxy-config : Configuring firewall for glance ------------------------ 3.83s 2025-09-23 19:18:28.540312 | orchestrator | sysctl : Setting sysctl values ------------------------------------------ 3.72s 2025-09-23 19:18:28.540317 | orchestrator | haproxy-config : Copying over grafana haproxy config -------------------- 3.67s 2025-09-23 19:18:28.540322 | orchestrator | haproxy-config : Copying over manila haproxy config --------------------- 3.60s 2025-09-23 19:18:28.540332 | orchestrator | 2025-09-23 19:18:28 | INFO  | Task 82e28fa0-ff78-428e-9db2-f885c6c4c6d1 is in state SUCCESS 2025-09-23 19:18:28.540338 | orchestrator | 2025-09-23 19:18:28 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:28.540343 | orchestrator | 2025-09-23 19:18:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:31.568897 | orchestrator | 2025-09-23 19:18:31 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:31.571848 | orchestrator | 2025-09-23 19:18:31 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:31.575247 | orchestrator | 2025-09-23 19:18:31 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:31.575277 | orchestrator | 2025-09-23 19:18:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:34.608382 | orchestrator | 2025-09-23 19:18:34 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:34.609216 | orchestrator | 
2025-09-23 19:18:34 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:34.610465 | orchestrator | 2025-09-23 19:18:34 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:34.610478 | orchestrator | 2025-09-23 19:18:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:37.657935 | orchestrator | 2025-09-23 19:18:37 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:37.658084 | orchestrator | 2025-09-23 19:18:37 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:37.658313 | orchestrator | 2025-09-23 19:18:37 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:37.658480 | orchestrator | 2025-09-23 19:18:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:40.698508 | orchestrator | 2025-09-23 19:18:40 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:40.699938 | orchestrator | 2025-09-23 19:18:40 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:40.701993 | orchestrator | 2025-09-23 19:18:40 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:40.702423 | orchestrator | 2025-09-23 19:18:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:43.732763 | orchestrator | 2025-09-23 19:18:43 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:43.735423 | orchestrator | 2025-09-23 19:18:43 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:43.735822 | orchestrator | 2025-09-23 19:18:43 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:43.736019 | orchestrator | 2025-09-23 19:18:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:46.764038 | orchestrator | 2025-09-23 19:18:46 | INFO  | Task 
c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:46.764618 | orchestrator | 2025-09-23 19:18:46 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:46.765786 | orchestrator | 2025-09-23 19:18:46 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:46.766004 | orchestrator | 2025-09-23 19:18:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:49.796805 | orchestrator | 2025-09-23 19:18:49 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:49.797171 | orchestrator | 2025-09-23 19:18:49 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:49.797807 | orchestrator | 2025-09-23 19:18:49 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:49.797914 | orchestrator | 2025-09-23 19:18:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:52.827738 | orchestrator | 2025-09-23 19:18:52 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:52.829320 | orchestrator | 2025-09-23 19:18:52 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:52.831202 | orchestrator | 2025-09-23 19:18:52 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:52.831230 | orchestrator | 2025-09-23 19:18:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:18:55.873655 | orchestrator | 2025-09-23 19:18:55 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state STARTED 2025-09-23 19:18:55.873954 | orchestrator | 2025-09-23 19:18:55 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:18:55.877191 | orchestrator | 2025-09-23 19:18:55 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:18:55.877226 | orchestrator | 2025-09-23 19:18:55 | INFO  | Wait 1 second(s) until the next 
check 2025-09-23 19:22:23.210433 | orchestrator | 2025-09-23 19:22:23 | INFO  | Task c7c6c4c9-94f5-4649-8f16-9afaa60f5d4f is in state SUCCESS 2025-09-23 19:22:23.211549 | orchestrator | 2025-09-23 19:22:23.211583 | orchestrator | 2025-09-23 19:22:23.211594 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:22:23.211603 | orchestrator | 2025-09-23 19:22:23.211612 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:22:23.211621 | orchestrator | Tuesday 23 September 2025 19:18:32 +0000 (0:00:00.278) 0:00:00.278 ***** 2025-09-23 19:22:23.211656 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:23.211667 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:23.211690 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:23.211708 | orchestrator | 2025-09-23 19:22:23.211717 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:22:23.211726 |
orchestrator | Tuesday 23 September 2025 19:18:32 +0000 (0:00:00.284) 0:00:00.563 ***** 2025-09-23 19:22:23.211736 | orchestrator | ok: [testbed-node-0] => (item=enable_opensearch_True) 2025-09-23 19:22:23.211745 | orchestrator | ok: [testbed-node-1] => (item=enable_opensearch_True) 2025-09-23 19:22:23.211811 | orchestrator | ok: [testbed-node-2] => (item=enable_opensearch_True) 2025-09-23 19:22:23.211823 | orchestrator | 2025-09-23 19:22:23.211832 | orchestrator | PLAY [Apply role opensearch] *************************************************** 2025-09-23 19:22:23.211841 | orchestrator | 2025-09-23 19:22:23.211851 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2025-09-23 19:22:23.211860 | orchestrator | Tuesday 23 September 2025 19:18:33 +0000 (0:00:00.421) 0:00:00.984 ***** 2025-09-23 19:22:23.211870 | orchestrator | included: /ansible/roles/opensearch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:23.211880 | orchestrator | 2025-09-23 19:22:23.211889 | orchestrator | TASK [opensearch : Setting sysctl values] ************************************** 2025-09-23 19:22:23.211899 | orchestrator | Tuesday 23 September 2025 19:18:33 +0000 (0:00:00.488) 0:00:01.473 ***** 2025-09-23 19:22:23.211908 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-23 19:22:23.211940 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-23 19:22:23.211950 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-23 19:22:23.211959 | orchestrator | 2025-09-23 19:22:23.211967 | orchestrator | TASK [opensearch : Ensuring config directories exist] ************************** 2025-09-23 19:22:23.211976 | orchestrator | Tuesday 23 September 2025 19:18:35 +0000 (0:00:01.653) 0:00:03.126 ***** 2025-09-23 19:22:23.211988 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212001 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212031 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g 
-Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212052 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212064 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 
'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212096 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212106 | orchestrator | 2025-09-23 19:22:23.212115 | orchestrator | TASK [opensearch : 
include_tasks] ********************************************** 2025-09-23 19:22:23.212123 | orchestrator | Tuesday 23 September 2025 19:18:37 +0000 (0:00:02.143) 0:00:05.270 ***** 2025-09-23 19:22:23.212132 | orchestrator | included: /ansible/roles/opensearch/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:23.212141 | orchestrator | 2025-09-23 19:22:23.212149 | orchestrator | TASK [service-cert-copy : opensearch | Copying over extra CA certificates] ***** 2025-09-23 19:22:23.212158 | orchestrator | Tuesday 23 September 2025 19:18:37 +0000 (0:00:00.599) 0:00:05.870 ***** 2025-09-23 19:22:23.212179 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212196 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212206 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212216 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 
'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212235 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212251 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212261 | orchestrator | 2025-09-23 19:22:23.212270 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS certificate] *** 2025-09-23 19:22:23.212279 | orchestrator | Tuesday 23 September 2025 19:18:41 +0000 (0:00:03.242) 0:00:09.112 ***** 2025-09-23 19:22:23.212288 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:22:23.212297 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 
'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:22:23.212307 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:23.212316 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:22:23.212341 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': 
{'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:22:23.212351 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:22:23.212361 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 
'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:22:23.212370 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:23.212379 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:23.212388 | orchestrator | 2025-09-23 19:22:23.212397 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS key] *** 2025-09-23 19:22:23.212405 | orchestrator | Tuesday 23 September 2025 19:18:42 +0000 (0:00:00.911) 0:00:10.023 ***** 2025-09-23 19:22:23.212414 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 
'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:22:23.212460 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:22:23.212479 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:23.212494 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': 
{'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:22:23.212510 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:22:23.212525 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:23.212540 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 
'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-23 19:22:23.212582 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-23 19:22:23.212602 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:23.212618 | orchestrator | 2025-09-23 19:22:23.212633 | orchestrator | TASK [opensearch : Copying over config.json files for services] **************** 2025-09-23 19:22:23.212649 | orchestrator | Tuesday 23 September 2025 19:18:43 +0000 (0:00:01.252) 0:00:11.276 ***** 2025-09-23 19:22:23.212664 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': 
['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212681 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212698 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.212740 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212760 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212778 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.212795 | orchestrator | 2025-09-23 19:22:23.212811 | orchestrator | TASK [opensearch : Copying over opensearch service config file] **************** 2025-09-23 19:22:23.212827 | orchestrator | Tuesday 23 September 2025 19:18:45 +0000 (0:00:02.613) 0:00:13.889 ***** 2025-09-23 19:22:23.212851 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:23.212866 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:23.212880 | 
orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:23.212894 | orchestrator | 2025-09-23 19:22:23.212910 | orchestrator | TASK [opensearch : Copying over opensearch-dashboards config file] ************* 2025-09-23 19:22:23.212925 | orchestrator | Tuesday 23 September 2025 19:18:48 +0000 (0:00:02.525) 0:00:16.415 ***** 2025-09-23 19:22:23.212940 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:23.212954 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:23.212970 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:23.212985 | orchestrator | 2025-09-23 19:22:23.213000 | orchestrator | TASK [opensearch : Check opensearch containers] ******************************** 2025-09-23 19:22:23.213017 | orchestrator | Tuesday 23 September 2025 19:18:50 +0000 (0:00:02.040) 0:00:18.455 ***** 2025-09-23 19:22:23.213033 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.213218 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g 
-Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.213233 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-23 19:22:23.213244 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.213262 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.213283 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': 
['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-23 19:22:23.213293 | orchestrator | 2025-09-23 19:22:23.213302 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2025-09-23 19:22:23.213311 | orchestrator | Tuesday 23 September 2025 19:18:52 +0000 (0:00:02.147) 0:00:20.603 ***** 2025-09-23 19:22:23.213320 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:23.213328 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:23.213337 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:23.213346 | orchestrator | 2025-09-23 19:22:23.213354 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2025-09-23 19:22:23.213363 | orchestrator | Tuesday 23 September 2025 19:18:53 +0000 (0:00:00.296) 0:00:20.900 ***** 2025-09-23 19:22:23.213371 | orchestrator | 2025-09-23 19:22:23.213425 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2025-09-23 19:22:23.213436 | orchestrator | Tuesday 23 September 2025 19:18:53 +0000 (0:00:00.063) 0:00:20.963 ***** 2025-09-23 19:22:23.213445 | orchestrator | 2025-09-23 19:22:23.213453 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2025-09-23 19:22:23.213462 | 
orchestrator | Tuesday 23 September 2025 19:18:53 +0000 (0:00:00.067) 0:00:21.030 ***** 2025-09-23 19:22:23.213470 | orchestrator | 2025-09-23 19:22:23.213479 | orchestrator | RUNNING HANDLER [opensearch : Disable shard allocation] ************************ 2025-09-23 19:22:23.213487 | orchestrator | Tuesday 23 September 2025 19:18:53 +0000 (0:00:00.067) 0:00:21.098 ***** 2025-09-23 19:22:23.213502 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:23.213511 | orchestrator | 2025-09-23 19:22:23.213520 | orchestrator | RUNNING HANDLER [opensearch : Perform a flush] ********************************* 2025-09-23 19:22:23.213528 | orchestrator | Tuesday 23 September 2025 19:18:53 +0000 (0:00:00.232) 0:00:21.330 ***** 2025-09-23 19:22:23.213537 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:23.213545 | orchestrator | 2025-09-23 19:22:23.213554 | orchestrator | RUNNING HANDLER [opensearch : Restart opensearch container] ******************** 2025-09-23 19:22:23.213562 | orchestrator | Tuesday 23 September 2025 19:18:54 +0000 (0:00:00.643) 0:00:21.974 ***** 2025-09-23 19:22:23.213571 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:23.213580 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:23.213588 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:23.213597 | orchestrator | 2025-09-23 19:22:23.213605 | orchestrator | RUNNING HANDLER [opensearch : Restart opensearch-dashboards container] ********* 2025-09-23 19:22:23.213614 | orchestrator | Tuesday 23 September 2025 19:19:50 +0000 (0:00:56.423) 0:01:18.398 ***** 2025-09-23 19:22:23.213622 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:23.213631 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:23.213639 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:23.213648 | orchestrator | 2025-09-23 19:22:23.213656 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2025-09-23 19:22:23.213665 | 
orchestrator | Tuesday 23 September 2025 19:21:04 +0000 (0:01:13.916) 0:02:32.315 ***** 2025-09-23 19:22:23.213674 | orchestrator | included: /ansible/roles/opensearch/tasks/post-config.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:23.213682 | orchestrator | 2025-09-23 19:22:23.213691 | orchestrator | TASK [opensearch : Wait for OpenSearch to become ready] ************************ 2025-09-23 19:22:23.213700 | orchestrator | Tuesday 23 September 2025 19:21:04 +0000 (0:00:00.507) 0:02:32.823 ***** 2025-09-23 19:22:23.213708 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (30 retries left). 2025-09-23 19:22:23.213719 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (29 retries left). 2025-09-23 19:22:23.213727 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (28 retries left). 2025-09-23 19:22:23.213735 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (27 retries left). 2025-09-23 19:22:23.213744 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (26 retries left). 2025-09-23 19:22:23.213753 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (25 retries left). 2025-09-23 19:22:23.213761 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (24 retries left). 2025-09-23 19:22:23.213770 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (23 retries left). 2025-09-23 19:22:23.213778 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (22 retries left). 2025-09-23 19:22:23.213787 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (21 retries left). 
2025-09-23 19:22:23.213795 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (20 retries left). 2025-09-23 19:22:23.213804 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (19 retries left). 2025-09-23 19:22:23.213822 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (18 retries left). 2025-09-23 19:22:23.213834 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (17 retries left). 2025-09-23 19:22:23.213843 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (16 retries left). 2025-09-23 19:22:23.213853 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (15 retries left). 2025-09-23 19:22:23.213862 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (14 retries left). 2025-09-23 19:22:23.213877 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (13 retries left). 2025-09-23 19:22:23.213887 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (12 retries left). 2025-09-23 19:22:23.213896 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (11 retries left). 2025-09-23 19:22:23.213906 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (10 retries left). 2025-09-23 19:22:23.213915 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (9 retries left). 2025-09-23 19:22:23.213925 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (8 retries left). 2025-09-23 19:22:23.213934 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (7 retries left). 
2025-09-23 19:22:23.213944 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (6 retries left). 2025-09-23 19:22:23.213953 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (5 retries left). 2025-09-23 19:22:23.213963 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (4 retries left). 2025-09-23 19:22:23.213973 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (3 retries left). 2025-09-23 19:22:23.213982 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (2 retries left). 2025-09-23 19:22:23.213992 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (1 retries left). 2025-09-23 19:22:23.214002 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"attempts": 30, "changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-23 19:22:23.214012 | orchestrator | 2025-09-23 19:22:23.214095 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:22:23.214106 | orchestrator | testbed-node-0 : ok=14  changed=9  unreachable=0 failed=1  skipped=5  rescued=0 ignored=0 2025-09-23 19:22:23.214117 | orchestrator | testbed-node-1 : ok=14  changed=9  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-09-23 19:22:23.214127 | orchestrator | testbed-node-2 : ok=14  changed=9  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-09-23 19:22:23.214135 | orchestrator | 2025-09-23 19:22:23.214144 | orchestrator | 2025-09-23 19:22:23.214153 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:22:23.214162 | orchestrator | Tuesday 23 September 2025 19:22:20 +0000 (0:01:15.668) 0:03:48.491 ***** 2025-09-23 19:22:23.214171 | orchestrator | =============================================================================== 
2025-09-23 19:22:23.214179 | orchestrator | opensearch : Wait for OpenSearch to become ready ----------------------- 75.67s 2025-09-23 19:22:23.214221 | orchestrator | opensearch : Restart opensearch-dashboards container ------------------- 73.92s 2025-09-23 19:22:23.214230 | orchestrator | opensearch : Restart opensearch container ------------------------------ 56.42s 2025-09-23 19:22:23.214239 | orchestrator | service-cert-copy : opensearch | Copying over extra CA certificates ----- 3.24s 2025-09-23 19:22:23.214247 | orchestrator | opensearch : Copying over config.json files for services ---------------- 2.61s 2025-09-23 19:22:23.214256 | orchestrator | opensearch : Copying over opensearch service config file ---------------- 2.53s 2025-09-23 19:22:23.214265 | orchestrator | opensearch : Check opensearch containers -------------------------------- 2.15s 2025-09-23 19:22:23.214273 | orchestrator | opensearch : Ensuring config directories exist -------------------------- 2.14s 2025-09-23 19:22:23.214282 | orchestrator | opensearch : Copying over opensearch-dashboards config file ------------- 2.04s 2025-09-23 19:22:23.214290 | orchestrator | opensearch : Setting sysctl values -------------------------------------- 1.65s 2025-09-23 19:22:23.214305 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS key --- 1.25s 2025-09-23 19:22:23.214314 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS certificate --- 0.91s 2025-09-23 19:22:23.214323 | orchestrator | opensearch : Perform a flush -------------------------------------------- 0.64s 2025-09-23 19:22:23.214332 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.60s 2025-09-23 19:22:23.214340 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.51s 2025-09-23 19:22:23.214349 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.49s 
2025-09-23 19:22:23.214368 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.42s 2025-09-23 19:22:23.214378 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.30s 2025-09-23 19:22:23.214386 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.29s 2025-09-23 19:22:23.214395 | orchestrator | opensearch : Disable shard allocation ----------------------------------- 0.23s 2025-09-23 19:22:23.214403 | orchestrator | 2025-09-23 19:22:23 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:23.216183 | orchestrator | 2025-09-23 19:22:23 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:23.216269 | orchestrator | 2025-09-23 19:22:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:26.257473 | orchestrator | 2025-09-23 19:22:26 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:26.259771 | orchestrator | 2025-09-23 19:22:26 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:26.260058 | orchestrator | 2025-09-23 19:22:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:29.308754 | orchestrator | 2025-09-23 19:22:29 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:29.311295 | orchestrator | 2025-09-23 19:22:29 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:29.311384 | orchestrator | 2025-09-23 19:22:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:32.355514 | orchestrator | 2025-09-23 19:22:32 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:32.357387 | orchestrator | 2025-09-23 19:22:32 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:32.357551 | orchestrator | 2025-09-23 19:22:32 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:22:35.402185 | orchestrator | 2025-09-23 19:22:35 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:35.404118 | orchestrator | 2025-09-23 19:22:35 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:35.404156 | orchestrator | 2025-09-23 19:22:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:38.455635 | orchestrator | 2025-09-23 19:22:38 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:38.457980 | orchestrator | 2025-09-23 19:22:38 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:38.458193 | orchestrator | 2025-09-23 19:22:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:41.499506 | orchestrator | 2025-09-23 19:22:41 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:41.501368 | orchestrator | 2025-09-23 19:22:41 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:41.501414 | orchestrator | 2025-09-23 19:22:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:44.543899 | orchestrator | 2025-09-23 19:22:44 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:44.546359 | orchestrator | 2025-09-23 19:22:44 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:44.546411 | orchestrator | 2025-09-23 19:22:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:47.593552 | orchestrator | 2025-09-23 19:22:47 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:47.593656 | orchestrator | 2025-09-23 19:22:47 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:47.593671 | orchestrator | 2025-09-23 19:22:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:50.630819 | orchestrator | 
2025-09-23 19:22:50 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:50.632333 | orchestrator | 2025-09-23 19:22:50 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:50.632374 | orchestrator | 2025-09-23 19:22:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:53.680200 | orchestrator | 2025-09-23 19:22:53 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:53.681462 | orchestrator | 2025-09-23 19:22:53 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state STARTED 2025-09-23 19:22:53.681596 | orchestrator | 2025-09-23 19:22:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:56.726588 | orchestrator | 2025-09-23 19:22:56 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state STARTED 2025-09-23 19:22:56.729101 | orchestrator | 2025-09-23 19:22:56 | INFO  | Task 12812ee4-1b96-413d-9fa8-c8c3decc85b3 is in state SUCCESS 2025-09-23 19:22:56.729142 | orchestrator | 2025-09-23 19:22:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:22:56.731476 | orchestrator | 2025-09-23 19:22:56.731502 | orchestrator | 2025-09-23 19:22:56.731508 | orchestrator | PLAY [Set kolla_action_mariadb] ************************************************ 2025-09-23 19:22:56.731549 | orchestrator | 2025-09-23 19:22:56.731556 | orchestrator | TASK [Inform the user about the following task] ******************************** 2025-09-23 19:22:56.731562 | orchestrator | Tuesday 23 September 2025 19:18:32 +0000 (0:00:00.102) 0:00:00.102 ***** 2025-09-23 19:22:56.731567 | orchestrator | ok: [localhost] => { 2025-09-23 19:22:56.731574 | orchestrator |  "msg": "The task 'Check MariaDB service' fails if the MariaDB service has not yet been deployed. This is fine." 
2025-09-23 19:22:56.731580 | orchestrator | } 2025-09-23 19:22:56.731586 | orchestrator | 2025-09-23 19:22:56.731592 | orchestrator | TASK [Check MariaDB service] *************************************************** 2025-09-23 19:22:56.731597 | orchestrator | Tuesday 23 September 2025 19:18:32 +0000 (0:00:00.048) 0:00:00.151 ***** 2025-09-23 19:22:56.731603 | orchestrator | fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.9:3306"} 2025-09-23 19:22:56.731609 | orchestrator | ...ignoring 2025-09-23 19:22:56.731615 | orchestrator | 2025-09-23 19:22:56.731620 | orchestrator | TASK [Set kolla_action_mariadb = upgrade if MariaDB is already running] ******** 2025-09-23 19:22:56.731625 | orchestrator | Tuesday 23 September 2025 19:18:35 +0000 (0:00:02.847) 0:00:02.998 ***** 2025-09-23 19:22:56.731630 | orchestrator | skipping: [localhost] 2025-09-23 19:22:56.731636 | orchestrator | 2025-09-23 19:22:56.731641 | orchestrator | TASK [Set kolla_action_mariadb = kolla_action_ng] ****************************** 2025-09-23 19:22:56.731646 | orchestrator | Tuesday 23 September 2025 19:18:35 +0000 (0:00:00.049) 0:00:03.047 ***** 2025-09-23 19:22:56.731651 | orchestrator | ok: [localhost] 2025-09-23 19:22:56.731657 | orchestrator | 2025-09-23 19:22:56.731662 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:22:56.731667 | orchestrator | 2025-09-23 19:22:56.731690 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:22:56.731696 | orchestrator | Tuesday 23 September 2025 19:18:35 +0000 (0:00:00.146) 0:00:03.193 ***** 2025-09-23 19:22:56.731701 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:56.731706 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:56.731711 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:56.731716 | orchestrator | 2025-09-23 19:22:56.731721 | 
orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:22:56.731726 | orchestrator | Tuesday 23 September 2025 19:18:35 +0000 (0:00:00.491) 0:00:03.685 ***** 2025-09-23 19:22:56.731740 | orchestrator | ok: [testbed-node-0] => (item=enable_mariadb_True) 2025-09-23 19:22:56.731747 | orchestrator | ok: [testbed-node-1] => (item=enable_mariadb_True) 2025-09-23 19:22:56.731752 | orchestrator | ok: [testbed-node-2] => (item=enable_mariadb_True) 2025-09-23 19:22:56.731757 | orchestrator | 2025-09-23 19:22:56.731762 | orchestrator | PLAY [Apply role mariadb] ****************************************************** 2025-09-23 19:22:56.731767 | orchestrator | 2025-09-23 19:22:56.731772 | orchestrator | TASK [mariadb : Group MariaDB hosts based on shards] *************************** 2025-09-23 19:22:56.731777 | orchestrator | Tuesday 23 September 2025 19:18:36 +0000 (0:00:00.695) 0:00:04.381 ***** 2025-09-23 19:22:56.731782 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2025-09-23 19:22:56.731787 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1) 2025-09-23 19:22:56.731792 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2) 2025-09-23 19:22:56.731797 | orchestrator | 2025-09-23 19:22:56.731802 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2025-09-23 19:22:56.731807 | orchestrator | Tuesday 23 September 2025 19:18:37 +0000 (0:00:00.479) 0:00:04.860 ***** 2025-09-23 19:22:56.731812 | orchestrator | included: /ansible/roles/mariadb/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:56.731819 | orchestrator | 2025-09-23 19:22:56.731824 | orchestrator | TASK [mariadb : Ensuring config directories exist] ***************************** 2025-09-23 19:22:56.731829 | orchestrator | Tuesday 23 September 2025 19:18:37 +0000 (0:00:00.510) 0:00:05.371 ***** 2025-09-23 19:22:56.731854 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-23 19:22:56.731863 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 
'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-23 19:22:56.731881 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2025-09-23 19:22:56.731887 | orchestrator |
2025-09-23 19:22:56.731895 | orchestrator | TASK [mariadb : Ensuring database backup config directory exists] **************
2025-09-23 19:22:56.731900 | orchestrator | Tuesday 23 September 2025 19:18:41 +0000 (0:00:03.734) 0:00:09.105 *****
2025-09-23 19:22:56.731905 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:56.731911 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:56.731916 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:22:56.731925 | orchestrator |
2025-09-23 19:22:56.731931 | orchestrator | TASK [mariadb : Copying over my.cnf for mariabackup] ***************************
2025-09-23 19:22:56.731936 | orchestrator | Tuesday 23 September 2025 19:18:42 +0000 (0:00:00.812) 0:00:09.918 *****
2025-09-23 19:22:56.731941 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:56.731946 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:56.731951 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:22:56.731956 | orchestrator |
2025-09-23 19:22:56.731961 | orchestrator | TASK [mariadb : Copying over config.json files for services] *******************
2025-09-23 19:22:56.731966 | orchestrator | Tuesday 23 September 2025 19:18:43 +0000 (0:00:01.628) 0:00:11.547 *****
2025-09-23 19:22:56.731972 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port
3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-23 19:22:56.731984 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 
'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-23 19:22:56.731995 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout 
client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})
2025-09-23 19:22:56.732001 | orchestrator |
2025-09-23 19:22:56.732006 | orchestrator | TASK [mariadb : Copying over config.json files for mariabackup] ****************
2025-09-23 19:22:56.732011 | orchestrator | Tuesday 23 September 2025 19:18:47 +0000 (0:00:03.770) 0:00:15.317 *****
2025-09-23 19:22:56.732016 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:56.732021 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:56.732026 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:22:56.732031 | orchestrator |
2025-09-23 19:22:56.732036 | orchestrator | TASK [mariadb : Copying over galera.cnf] ***************************************
2025-09-23 19:22:56.732041 | orchestrator | Tuesday 23 September 2025 19:18:48 +0000 (0:00:01.247) 0:00:16.565 *****
2025-09-23 19:22:56.732046 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:22:56.732051 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:22:56.732056 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:22:56.732061 | orchestrator |
2025-09-23 19:22:56.732066 | orchestrator | TASK [mariadb : include_tasks] *************************************************
2025-09-23 19:22:56.732093 | orchestrator | Tuesday 23 September 2025 19:18:53 +0000 (0:00:04.559) 0:00:21.125 *****
2025-09-23 19:22:56.732099 | orchestrator | included: /ansible/roles/mariadb/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:22:56.732105 | orchestrator |
2025-09-23 19:22:56.732111 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2025-09-23
19:22:56.732116 | orchestrator | Tuesday 23 September 2025 19:18:53 +0000 (0:00:00.509) 0:00:21.634 ***** 2025-09-23 19:22:56.732131 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-23 19:22:56.732142 | orchestrator | 
skipping: [testbed-node-0] 2025-09-23 19:22:56.732148 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-23 19:22:56.732155 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:56.732169 | orchestrator | 
skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-23 19:22:56.732179 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.732185 | orchestrator | 2025-09-23 19:22:56.732191 | orchestrator | TASK [service-cert-copy : mariadb 
| Copying over backend internal TLS certificate] *** 2025-09-23 19:22:56.732196 | orchestrator | Tuesday 23 September 2025 19:18:57 +0000 (0:00:03.680) 0:00:25.315 ***** 2025-09-23 19:22:56.732203 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 
5 backup', '']}}}})  2025-09-23 19:22:56.732210 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:56.732220 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-23 19:22:56.732231 | orchestrator | skipping: 
[testbed-node-0] 2025-09-23 19:22:56.732289 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-23 19:22:56.732301 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.732307 | orchestrator | 2025-09-23 
19:22:56.732313 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2025-09-23 19:22:56.732319 | orchestrator | Tuesday 23 September 2025 19:19:00 +0000 (0:00:02.513) 0:00:27.829 ***** 2025-09-23 19:22:56.732328 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server 
testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-23 19:22:56.732339 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:56.732350 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 
backup', '']}}}})  2025-09-23 19:22:56.732357 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.732364 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-23 19:22:56.732374 | orchestrator | skipping: 
[testbed-node-1] 2025-09-23 19:22:56.732380 | orchestrator | 2025-09-23 19:22:56.732386 | orchestrator | TASK [mariadb : Check mariadb containers] ************************************** 2025-09-23 19:22:56.732391 | orchestrator | Tuesday 23 September 2025 19:19:02 +0000 (0:00:02.938) 0:00:30.768 ***** 2025-09-23 19:22:56.732509 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 
check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-23 19:22:56.732517 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', 
'']}}}}) 2025-09-23 19:22:56.732534 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-23 19:22:56.732540 | orchestrator | 2025-09-23 19:22:56.732545 | orchestrator | TASK [mariadb : Create MariaDB volume] 
*****************************************
2025-09-23 19:22:56.732550 | orchestrator | Tuesday 23 September 2025 19:19:06 +0000 (0:00:03.791) 0:00:34.560 *****
2025-09-23 19:22:56.732556 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:22:56.732561 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:22:56.732566 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:22:56.732571 | orchestrator |
2025-09-23 19:22:56.732576 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB volume availability] *************
2025-09-23 19:22:56.732581 | orchestrator | Tuesday 23 September 2025 19:19:07 +0000 (0:00:00.884) 0:00:35.445 *****
2025-09-23 19:22:56.732586 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:56.732591 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:56.732596 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:56.732601 | orchestrator |
2025-09-23 19:22:56.732607 | orchestrator | TASK [mariadb : Establish whether the cluster has already existed] *************
2025-09-23 19:22:56.732612 | orchestrator | Tuesday 23 September 2025 19:19:08 +0000 (0:00:00.974) 0:00:36.419 *****
2025-09-23 19:22:56.732617 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:56.732622 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:56.732627 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:56.732632 | orchestrator |
2025-09-23 19:22:56.732637 | orchestrator | TASK [mariadb : Check MariaDB service port liveness] ***************************
2025-09-23 19:22:56.732642 | orchestrator | Tuesday 23 September 2025 19:19:09 +0000 (0:00:00.516) 0:00:36.936 *****
2025-09-23 19:22:56.732652 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.10:3306"}
2025-09-23 19:22:56.732657 | orchestrator | ...ignoring
2025-09-23 19:22:56.732662 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.12:3306"}
2025-09-23 19:22:56.732668 | orchestrator | ...ignoring
2025-09-23 19:22:56.732673 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.11:3306"}
2025-09-23 19:22:56.732678 | orchestrator | ...ignoring
2025-09-23 19:22:56.732683 | orchestrator |
2025-09-23 19:22:56.732688 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service port liveness] ***********
2025-09-23 19:22:56.732693 | orchestrator | Tuesday 23 September 2025 19:19:20 +0000 (0:00:11.158) 0:00:48.094 *****
2025-09-23 19:22:56.732699 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:56.732704 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:56.732709 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:56.732714 | orchestrator |
2025-09-23 19:22:56.732719 | orchestrator | TASK [mariadb : Fail on existing but stopped cluster] **************************
2025-09-23 19:22:56.732724 | orchestrator | Tuesday 23 September 2025 19:19:20 +0000 (0:00:00.461) 0:00:48.556 *****
2025-09-23 19:22:56.732729 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:56.732734 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:56.732739 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:56.732744 | orchestrator |
2025-09-23 19:22:56.732749 | orchestrator | TASK [mariadb : Check MariaDB service WSREP sync status] ***********************
2025-09-23 19:22:56.732754 | orchestrator | Tuesday 23 September 2025 19:19:21 +0000 (0:00:00.636) 0:00:49.192 *****
2025-09-23 19:22:56.732759 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:56.732764 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:56.732769 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:56.732774 | orchestrator |
2025-09-23 19:22:56.732779 | orchestrator | TASK [mariadb : Extract MariaDB service WSREP sync status] *********************
2025-09-23 19:22:56.732785 | orchestrator | Tuesday 23 September 2025 19:19:21 +0000 (0:00:00.455) 0:00:49.602 *****
2025-09-23 19:22:56.732790 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:56.732795 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:56.732800 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:56.732805 | orchestrator |
2025-09-23 19:22:56.732810 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service WSREP sync status] *******
2025-09-23 19:22:56.732815 | orchestrator | Tuesday 23 September 2025 19:19:22 +0000 (0:00:00.413) 0:00:50.058 *****
2025-09-23 19:22:56.732820 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:56.732825 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:56.732833 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:56.732838 | orchestrator |
2025-09-23 19:22:56.732844 | orchestrator | TASK [mariadb : Fail when MariaDB services are not synced across the whole cluster] ***
2025-09-23 19:22:56.732849 | orchestrator | Tuesday 23 September 2025 19:19:22 +0000 (0:00:00.615) 0:00:50.472 *****
2025-09-23 19:22:56.732856 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:56.732862 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:56.732867 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:56.732872 | orchestrator |
2025-09-23 19:22:56.732877 | orchestrator | TASK [mariadb : include_tasks] *************************************************
2025-09-23 19:22:56.732882 | orchestrator | Tuesday 23 September 2025 19:19:23 +0000 (0:00:00.615) 0:00:51.087 *****
2025-09-23 19:22:56.732887 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:56.732892 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:56.732897 | orchestrator | included: /ansible/roles/mariadb/tasks/bootstrap_cluster.yml for testbed-node-0
2025-09-23 19:22:56.732902 | orchestrator |
2025-09-23
19:22:56.732907 | orchestrator | TASK [mariadb : Running MariaDB bootstrap container] *************************** 2025-09-23 19:22:56.732917 | orchestrator | Tuesday 23 September 2025 19:19:23 +0000 (0:00:00.386) 0:00:51.473 ***** 2025-09-23 19:22:56.732922 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:56.732927 | orchestrator | 2025-09-23 19:22:56.732932 | orchestrator | TASK [mariadb : Store bootstrap host name into facts] ************************** 2025-09-23 19:22:56.732937 | orchestrator | Tuesday 23 September 2025 19:19:33 +0000 (0:00:10.126) 0:01:01.600 ***** 2025-09-23 19:22:56.732942 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:56.732947 | orchestrator | 2025-09-23 19:22:56.732952 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2025-09-23 19:22:56.732957 | orchestrator | Tuesday 23 September 2025 19:19:33 +0000 (0:00:00.133) 0:01:01.733 ***** 2025-09-23 19:22:56.732963 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:56.732968 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:56.732973 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.732978 | orchestrator | 2025-09-23 19:22:56.732983 | orchestrator | RUNNING HANDLER [mariadb : Starting first MariaDB container] ******************* 2025-09-23 19:22:56.732988 | orchestrator | Tuesday 23 September 2025 19:19:34 +0000 (0:00:00.955) 0:01:02.688 ***** 2025-09-23 19:22:56.732995 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:56.733001 | orchestrator | 2025-09-23 19:22:56.733006 | orchestrator | RUNNING HANDLER [mariadb : Wait for first MariaDB service port liveness] ******* 2025-09-23 19:22:56.733011 | orchestrator | Tuesday 23 September 2025 19:19:42 +0000 (0:00:07.524) 0:01:10.212 ***** 2025-09-23 19:22:56.733016 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:56.733021 | orchestrator | 2025-09-23 19:22:56.733026 | orchestrator | RUNNING HANDLER [mariadb : Wait for first MariaDB 
service to sync WSREP] ******* 2025-09-23 19:22:56.733031 | orchestrator | Tuesday 23 September 2025 19:19:44 +0000 (0:00:01.637) 0:01:11.850 ***** 2025-09-23 19:22:56.733036 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:56.733041 | orchestrator | 2025-09-23 19:22:56.733046 | orchestrator | RUNNING HANDLER [mariadb : Ensure MariaDB is running normally on bootstrap host] *** 2025-09-23 19:22:56.733052 | orchestrator | Tuesday 23 September 2025 19:19:46 +0000 (0:00:02.468) 0:01:14.318 ***** 2025-09-23 19:22:56.733057 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:56.733062 | orchestrator | 2025-09-23 19:22:56.733067 | orchestrator | RUNNING HANDLER [mariadb : Restart MariaDB on existing cluster members] ******** 2025-09-23 19:22:56.733084 | orchestrator | Tuesday 23 September 2025 19:19:46 +0000 (0:00:00.130) 0:01:14.449 ***** 2025-09-23 19:22:56.733090 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:56.733095 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:56.733100 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.733105 | orchestrator | 2025-09-23 19:22:56.733110 | orchestrator | RUNNING HANDLER [mariadb : Start MariaDB on new nodes] ************************* 2025-09-23 19:22:56.733115 | orchestrator | Tuesday 23 September 2025 19:19:46 +0000 (0:00:00.308) 0:01:14.758 ***** 2025-09-23 19:22:56.733121 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:56.733126 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: mariadb_restart 2025-09-23 19:22:56.733131 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:56.733136 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:56.733141 | orchestrator | 2025-09-23 19:22:56.733147 | orchestrator | PLAY [Restart mariadb services] ************************************************ 2025-09-23 19:22:56.733153 | orchestrator | skipping: no hosts matched 2025-09-23 19:22:56.733159 | orchestrator | 2025-09-23 19:22:56.733165 
| orchestrator | PLAY [Start mariadb services] ************************************************** 2025-09-23 19:22:56.733170 | orchestrator | 2025-09-23 19:22:56.733176 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2025-09-23 19:22:56.733182 | orchestrator | Tuesday 23 September 2025 19:19:47 +0000 (0:00:00.518) 0:01:15.277 ***** 2025-09-23 19:22:56.733187 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:56.733193 | orchestrator | 2025-09-23 19:22:56.733199 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2025-09-23 19:22:56.733204 | orchestrator | Tuesday 23 September 2025 19:20:06 +0000 (0:00:18.934) 0:01:34.211 ***** 2025-09-23 19:22:56.733214 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:56.733220 | orchestrator | 2025-09-23 19:22:56.733226 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2025-09-23 19:22:56.733232 | orchestrator | Tuesday 23 September 2025 19:20:26 +0000 (0:00:20.564) 0:01:54.776 ***** 2025-09-23 19:22:56.733237 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:56.733243 | orchestrator | 2025-09-23 19:22:56.733249 | orchestrator | PLAY [Start mariadb services] ************************************************** 2025-09-23 19:22:56.733255 | orchestrator | 2025-09-23 19:22:56.733260 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2025-09-23 19:22:56.733266 | orchestrator | Tuesday 23 September 2025 19:20:29 +0000 (0:00:02.404) 0:01:57.180 ***** 2025-09-23 19:22:56.733272 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:56.733277 | orchestrator | 2025-09-23 19:22:56.733283 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2025-09-23 19:22:56.733289 | orchestrator | Tuesday 23 September 2025 19:20:51 +0000 (0:00:22.256) 0:02:19.437 ***** 2025-09-23 19:22:56.733294 | 
orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:56.733300 | orchestrator | 2025-09-23 19:22:56.733306 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2025-09-23 19:22:56.733314 | orchestrator | Tuesday 23 September 2025 19:21:07 +0000 (0:00:15.546) 0:02:34.984 ***** 2025-09-23 19:22:56.733320 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:56.733326 | orchestrator | 2025-09-23 19:22:56.733332 | orchestrator | PLAY [Restart bootstrap mariadb service] *************************************** 2025-09-23 19:22:56.733337 | orchestrator | 2025-09-23 19:22:56.733345 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2025-09-23 19:22:56.733351 | orchestrator | Tuesday 23 September 2025 19:21:09 +0000 (0:00:02.475) 0:02:37.459 ***** 2025-09-23 19:22:56.733357 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:56.733363 | orchestrator | 2025-09-23 19:22:56.733369 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2025-09-23 19:22:56.733374 | orchestrator | Tuesday 23 September 2025 19:21:20 +0000 (0:00:11.062) 0:02:48.522 ***** 2025-09-23 19:22:56.733380 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:56.733385 | orchestrator | 2025-09-23 19:22:56.733391 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2025-09-23 19:22:56.733397 | orchestrator | Tuesday 23 September 2025 19:21:25 +0000 (0:00:04.624) 0:02:53.147 ***** 2025-09-23 19:22:56.733402 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:56.733408 | orchestrator | 2025-09-23 19:22:56.733414 | orchestrator | PLAY [Apply mariadb post-configuration] **************************************** 2025-09-23 19:22:56.733419 | orchestrator | 2025-09-23 19:22:56.733425 | orchestrator | TASK [Include mariadb post-deploy.yml] ***************************************** 2025-09-23 19:22:56.733431 | orchestrator | 
Tuesday 23 September 2025 19:21:27 +0000 (0:00:02.654) 0:02:55.802 ***** 2025-09-23 19:22:56.733436 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:56.733442 | orchestrator | 2025-09-23 19:22:56.733448 | orchestrator | TASK [mariadb : Creating shard root mysql user] ******************************** 2025-09-23 19:22:56.733453 | orchestrator | Tuesday 23 September 2025 19:21:28 +0000 (0:00:00.536) 0:02:56.338 ***** 2025-09-23 19:22:56.733459 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:56.733465 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.733471 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-23 19:22:56.733476 | orchestrator | 2025-09-23 19:22:56.733482 | orchestrator | TASK [mariadb : Creating mysql monitor user] *********************************** 2025-09-23 19:22:56.733488 | orchestrator | Tuesday 23 September 2025 19:21:29 +0000 (0:00:00.837) 0:02:57.176 ***** 2025-09-23 19:22:56.733494 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:56.733499 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.733505 | orchestrator | 2025-09-23 19:22:56.733511 | orchestrator | TASK [mariadb : Creating database backup user and setting permissions] ********* 2025-09-23 19:22:56.733520 | orchestrator | Tuesday 23 September 2025 19:21:29 +0000 (0:00:00.209) 0:02:57.385 ***** 2025-09-23 19:22:56.733525 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:56.733530 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.733535 | orchestrator | 2025-09-23 19:22:56.733540 | orchestrator | TASK [mariadb : Granting permissions on Mariabackup database to backup user] *** 2025-09-23 19:22:56.733546 | orchestrator | Tuesday 23 September 2025 19:21:29 +0000 (0:00:00.360) 0:02:57.745 ***** 2025-09-23 19:22:56.733551 | orchestrator | skipping: [testbed-node-1] 2025-09-23 
19:22:56.733556 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:56.733561 | orchestrator | 2025-09-23 19:22:56.733566 | orchestrator | TASK [mariadb : Wait for MariaDB service to be ready through VIP] ************** 2025-09-23 19:22:56.733571 | orchestrator | Tuesday 23 September 2025 19:21:30 +0000 (0:00:00.217) 0:02:57.963 ***** 2025-09-23 19:22:56.733576 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (6 retries left). 2025-09-23 19:22:56.733582 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (6 retries left). 2025-09-23 19:22:56.733587 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (5 retries left). 2025-09-23 19:22:56.733592 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (5 retries left). 2025-09-23 19:22:56.733597 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (4 retries left). 2025-09-23 19:22:56.733602 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (4 retries left). 2025-09-23 19:22:56.733608 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (3 retries left). 2025-09-23 19:22:56.733613 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (3 retries left). 2025-09-23 19:22:56.733618 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (2 retries left). 2025-09-23 19:22:56.733623 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (2 retries left). 2025-09-23 19:22:56.733628 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (1 retries left). 
FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (1 retries left).
fatal: [testbed-node-1]: FAILED! => {"attempts": 6, "changed": false, "cmd": ["docker", "exec", "mariadb", "mysql", "-h", "api-int.testbed.osism.xyz", "-P", "3306", "-u", "root_shard_0", "-ppassword", "-e", "show databases;"], "delta": "0:00:03.209063", "end": "2025-09-23 19:22:50.513351", "msg": "non-zero return code", "rc": 1, "start": "2025-09-23 19:22:47.304288", "stderr": "ERROR 2002 (HY000): Can't connect to server on 'api-int.testbed.osism.xyz' (115)", "stderr_lines": ["ERROR 2002 (HY000): Can't connect to server on 'api-int.testbed.osism.xyz' (115)"], "stdout": "", "stdout_lines": []}
fatal: [testbed-node-2]: FAILED! => {"attempts": 6, "changed": false, "cmd": ["docker", "exec", "mariadb", "mysql", "-h", "api-int.testbed.osism.xyz", "-P", "3306", "-u", "root_shard_0", "-ppassword", "-e", "show databases;"], "delta": "0:00:03.228890", "end": "2025-09-23 19:22:56.136581", "msg": "non-zero return code", "rc": 1, "start": "2025-09-23 19:22:52.907691", "stderr": "ERROR 2002 (HY000): Can't connect to server on 'api-int.testbed.osism.xyz' (115)", "stderr_lines": ["ERROR 2002 (HY000): Can't connect to server on 'api-int.testbed.osism.xyz' (115)"], "stdout": "", "stdout_lines": []}

PLAY RECAP *********************************************************************
localhost                  : ok=3    changed=0    unreachable=0    failed=0    skipped=1    rescued=0    ignored=1
testbed-node-0             : ok=29   changed=12   unreachable=0    failed=1    skipped=10   rescued=0    ignored=1
testbed-node-1             : ok=19   changed=7    unreachable=0    failed=1    skipped=17   rescued=0    ignored=1
testbed-node-2             : ok=19   changed=7    unreachable=0    failed=1    skipped=17   rescued=0    ignored=1


TASKS RECAP ********************************************************************
Tuesday 23 September 2025 19:22:56 +0000 (0:01:26.024) 0:04:23.988 *****
===============================================================================
mariadb : Wait for MariaDB service to be ready through VIP ------------- 86.02s
mariadb : Restart MariaDB container ------------------------------------ 41.19s
mariadb : Wait for MariaDB service port liveness ----------------------- 36.11s
mariadb : Check MariaDB service port liveness -------------------------- 11.16s
mariadb : Restart MariaDB container ------------------------------------ 11.06s
mariadb : Running MariaDB bootstrap container -------------------------- 10.13s
mariadb : Starting first MariaDB container ------------------------------ 7.52s
mariadb : Wait for MariaDB service to sync WSREP ------------------------ 4.88s
mariadb : Wait for MariaDB service port liveness ------------------------ 4.62s
mariadb : Copying over galera.cnf --------------------------------------- 4.56s
mariadb : Check mariadb containers -------------------------------------- 3.79s
mariadb : Copying over config.json files for services ------------------- 3.77s
mariadb : Ensuring config directories exist ----------------------------- 3.73s
service-cert-copy : mariadb | Copying over extra CA certificates -------- 3.68s
service-cert-copy : mariadb | Copying over backend internal TLS key ----- 2.94s
Check MariaDB service --------------------------------------------------- 2.85s
mariadb : Wait for MariaDB service to sync WSREP ------------------------ 2.65s
service-cert-copy : mariadb | Copying over backend internal TLS certificate --- 2.51s
mariadb : Wait for first MariaDB service to sync WSREP ------------------ 2.47s
mariadb : Wait for first MariaDB service port liveness ------------------ 1.64s
2025-09-23 19:22:59 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:22:59 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:22:59 | INFO  | Task 548a7c83-e510-4d38-9285-beff1f20fc74 is in state SUCCESS


PLAY [Prepare deployment of Ceph services] *************************************

TASK [ceph-facts : Include facts.yml] ******************************************
Tuesday 23 September 2025 19:12:15 +0000 (0:00:00.729) 0:00:00.729 *****
included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-facts : Check if it is atomic host] *********************************
Tuesday 23 September 2025 19:12:16 +0000 (0:00:00.962) 0:00:01.691 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-1]
ok: [testbed-node-0]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact is_atomic] *****************************************
Tuesday 23 September 2025 19:12:17 +0000 (0:00:01.725) 0:00:03.417 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Check if podman binary is present] **************************
Tuesday 23 September 2025 19:12:18 +0000 (0:00:01.030) 0:00:04.220 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact container_binary] **********************************
Tuesday 23 September 2025 19:12:19 +0000 (0:00:01.030) 0:00:05.251 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact ceph_cmd] ******************************************
Tuesday 23 September 2025 19:12:20 +0000 (0:00:00.704) 0:00:05.955 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact discovered_interpreter_python] *********************
Tuesday 23 September 2025 19:12:20 +0000 (0:00:00.493) 0:00:06.448 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] ***
Tuesday 23 September 2025 19:12:21 +0000 (0:00:00.832) 0:00:07.281 *****
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ******************
Tuesday 23 September 2025 19:12:22 +0000 (0:00:01.188) 0:00:08.470 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************
Tuesday 23 September 2025 19:12:23 +0000 (0:00:00.913) 0:00:09.383 *****
ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)

TASK [ceph-facts : Set_fact container_exec_cmd] ********************************
Tuesday 23 September 2025 19:12:24 +0000 (0:00:00.900) 0:00:10.284 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-2]
ok: [testbed-node-1]

TASK [ceph-facts : Find a running mon container] *******************************
Tuesday 23 September 2025 19:12:26 +0000 (0:00:01.449) 0:00:11.733 *****
ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)

TASK [ceph-facts : Check for a ceph mon socket] ********************************
Tuesday 23 September 2025 19:12:29 +0000 (0:00:03.256) 0:00:14.990 *****
skipping: [testbed-node-3] => (item=testbed-node-0)
skipping: [testbed-node-3] => (item=testbed-node-1)
skipping: [testbed-node-3] => (item=testbed-node-2)
skipping: [testbed-node-3]

TASK [ceph-facts : Check if the ceph mon socket is in-use] *********************
Tuesday 23 September 2025 19:12:29 +0000 (0:00:00.417) 0:00:15.408 *****
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3]

TASK [ceph-facts : Set_fact running_mon - non_container] ***********************
Tuesday 23 September 2025 19:12:30 +0000 (0:00:00.892) 0:00:16.300 *****
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
skipping: [testbed-node-3]

TASK [ceph-facts : Set_fact running_mon - container] ***************************
Tuesday 23 September 2025 19:12:30 +0000 (0:00:00.116) 0:00:16.416 *****
skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2025-09-23 19:12:27.185516', 'end': '2025-09-23 19:12:27.435809', 'delta': '0:00:00.250293', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2025-09-23 19:12:28.010663', 'end': '2025-09-23 19:12:28.339955', 'delta': '0:00:00.329292', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2025-09-23 19:12:28.849870', 'end': '2025-09-23 19:12:29.118313', 'delta': '0:00:00.268443', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
skipping: [testbed-node-3]

TASK [ceph-facts : Set_fact _container_exec_cmd] *******************************
Tuesday 23 September 2025 19:12:31 +0000 (0:00:00.661) 0:00:17.077 *****
ok: [testbed-node-3]
ok: [testbed-node-4]
ok: [testbed-node-5]
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-facts : Get current fsid if cluster is already running] *************
Tuesday 23 September 2025 19:12:33 +0000 (0:00:01.938) 0:00:19.016 *****
ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)]

TASK [ceph-facts : Set_fact current_fsid rc 1] *********************************
Tuesday 23 September 2025 19:12:34 +0000 (0:00:00.970) 0:00:19.986 *****
skipping: [testbed-node-3]
skipping: [testbed-node-4]
skipping: [testbed-node-5]
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-facts : Get current fsid] *******************************************
Tuesday 23 September 2025 19:12:36 +0000 (0:00:02.085) 0:00:22.072 *****
orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.785848 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.785859 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.785869 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.785880 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.785891 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.785901 | orchestrator | 2025-09-23 19:22:59.785912 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 2025-09-23 19:22:59.785923 | orchestrator | Tuesday 23 September 2025 19:12:38 +0000 (0:00:01.756) 0:00:23.828 ***** 2025-09-23 19:22:59.785934 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.785944 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.785954 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.785965 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.785975 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.785986 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.785996 | orchestrator | 2025-09-23 19:22:59.786007 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] **************************** 2025-09-23 19:22:59.786065 | orchestrator | Tuesday 23 September 2025 19:12:38 +0000 (0:00:00.720) 0:00:24.548 ***** 2025-09-23 19:22:59.786079 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786123 | orchestrator | 2025-09-23 19:22:59.786135 | orchestrator | TASK [ceph-facts : Generate cluster fsid] ************************************** 2025-09-23 19:22:59.786145 | orchestrator | Tuesday 23 September 2025 19:12:39 +0000 (0:00:00.099) 0:00:24.648 ***** 2025-09-23 19:22:59.786156 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786167 | orchestrator | 2025-09-23 19:22:59.786178 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 
2025-09-23 19:22:59.786188 | orchestrator | Tuesday 23 September 2025 19:12:39 +0000 (0:00:00.228) 0:00:24.876 ***** 2025-09-23 19:22:59.786199 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786210 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.786220 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.786231 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.786242 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.786252 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.786263 | orchestrator | 2025-09-23 19:22:59.786283 | orchestrator | TASK [ceph-facts : Resolve device link(s)] ************************************* 2025-09-23 19:22:59.786294 | orchestrator | Tuesday 23 September 2025 19:12:39 +0000 (0:00:00.552) 0:00:25.428 ***** 2025-09-23 19:22:59.786305 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786316 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.786332 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.786343 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.786354 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.786364 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.786375 | orchestrator | 2025-09-23 19:22:59.786386 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved symlinks] ************** 2025-09-23 19:22:59.786396 | orchestrator | Tuesday 23 September 2025 19:12:40 +0000 (0:00:00.757) 0:00:26.186 ***** 2025-09-23 19:22:59.786407 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786418 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.786428 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.786439 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.786449 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.786460 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.786470 | 
orchestrator | 2025-09-23 19:22:59.786481 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] *************************** 2025-09-23 19:22:59.786492 | orchestrator | Tuesday 23 September 2025 19:12:41 +0000 (0:00:00.835) 0:00:27.022 ***** 2025-09-23 19:22:59.786502 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786513 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.786523 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.786534 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.786546 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.786565 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.786583 | orchestrator | 2025-09-23 19:22:59.786601 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] **** 2025-09-23 19:22:59.786618 | orchestrator | Tuesday 23 September 2025 19:12:42 +0000 (0:00:00.978) 0:00:28.000 ***** 2025-09-23 19:22:59.786638 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.786656 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.786673 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786684 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.786695 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.786705 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.786716 | orchestrator | 2025-09-23 19:22:59.786727 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] *********************** 2025-09-23 19:22:59.786738 | orchestrator | Tuesday 23 September 2025 19:12:43 +0000 (0:00:00.638) 0:00:28.639 ***** 2025-09-23 19:22:59.786748 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786759 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.786769 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.786780 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.786799 | 
orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.786810 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.786820 | orchestrator | 2025-09-23 19:22:59.786831 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] *** 2025-09-23 19:22:59.786842 | orchestrator | Tuesday 23 September 2025 19:12:43 +0000 (0:00:00.750) 0:00:29.389 ***** 2025-09-23 19:22:59.786853 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.786864 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.786874 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.786885 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.786895 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.786906 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.786916 | orchestrator | 2025-09-23 19:22:59.786927 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************ 2025-09-23 19:22:59.786938 | orchestrator | Tuesday 23 September 2025 19:12:44 +0000 (0:00:00.553) 0:00:29.943 ***** 2025-09-23 19:22:59.786950 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e-osd--block--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e', 'dm-uuid-LVM-BYh1we6l1Rbny4mpPNGVfmFVqmlrDTdadBL2afMPVC7aYkeSl0VtWfEEDEItKBqD'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.786963 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': 
['dm-name-ceph--ad3a695b--9edf--562e--89c9--18fadd13d262-osd--block--ad3a695b--9edf--562e--89c9--18fadd13d262', 'dm-uuid-LVM-NcdhDJBq0TBcw9ePnA00uXvA5tL30WE3Z4S8MCdepjendah0VppDSjGgz9nPXIRI'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.786982 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787001 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787013 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787024 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 
'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787046 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787057 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787068 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787144 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': 
None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787182 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part1', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part14', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part15', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part16', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 
'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787197 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e-osd--block--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-qwNbwq-ZWIw-gtu3-bkEl-T6U4-liMO-iGzhMR', 'scsi-0QEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5', 'scsi-SQEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787218 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--ad3a695b--9edf--562e--89c9--18fadd13d262-osd--block--ad3a695b--9edf--562e--89c9--18fadd13d262'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-2FuINB-zcer-4mIL-BOFU-w1dA-hCsm-AWvBtO', 'scsi-0QEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd', 'scsi-SQEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787230 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6', 'scsi-SQEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787242 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-43-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787295 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': 
['dm-name-ceph--1c8984fd--f811--541c--8648--d34ada8a5304-osd--block--1c8984fd--f811--541c--8648--d34ada8a5304', 'dm-uuid-LVM-nw8hRIb2eDpdk169y1rdcFUze1XfuOjJllJ9bGkQ0w0EH5YlPs5Idof0C67ssk46'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787305 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--8028f60e--1a44--5536--9db2--40f94e230aee-osd--block--8028f60e--1a44--5536--9db2--40f94e230aee', 'dm-uuid-LVM-XJfizJhV9UhWBv2FwGTBmsjdeRQx0bAnGRp8GEaT01hL7vlh46uUFrtFT6WiLhoZ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787316 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787332 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 
Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787342 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.787353 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787362 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787372 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787382 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787392 | 
orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787420 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787435 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5-osd--block--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5', 'dm-uuid-LVM-cTPR2qR6Zc8oAkE17BbZLrodQs1QMSHCIyAIczA6d59xBSXvG9KA9cu5ghiYSaro'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787454 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part1', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part14', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part15', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part16', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787466 | 
orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--1c8984fd--f811--541c--8648--d34ada8a5304-osd--block--1c8984fd--f811--541c--8648--d34ada8a5304'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-A4Gl3a-uEFo-1YjV-onOt-lDti-Rblb-3dFZee', 'scsi-0QEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85', 'scsi-SQEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787487 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--a2ccb3fa--3e8c--5172--95cb--7cae39233d42-osd--block--a2ccb3fa--3e8c--5172--95cb--7cae39233d42', 'dm-uuid-LVM-QxliPBJmTpLitQexep3vAZAasAjeKSby8Zpqm5RSUCw8quKD9lV8fEk8m3kUJSyu'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787498 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--8028f60e--1a44--5536--9db2--40f94e230aee-osd--block--8028f60e--1a44--5536--9db2--40f94e230aee'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Ke1zo1-HE7e-DXga-aXdS-u4PO-3JOJ-cGNfpd', 'scsi-0QEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd', 'scsi-SQEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787515 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787526 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8', 'scsi-SQEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787536 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787549 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-40-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787568 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}) 
 2025-09-23 19:22:59.787585 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787611 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787631 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787648 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.787658 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787668 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 
'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787680 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part1', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part14', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part15', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part16', 
'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787701 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5-osd--block--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-E2n78L-aS7J-rCCR-K0MN-C7Uz-Tc8Z-2fbQrV', 'scsi-0QEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b', 'scsi-SQEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787719 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--a2ccb3fa--3e8c--5172--95cb--7cae39233d42-osd--block--a2ccb3fa--3e8c--5172--95cb--7cae39233d42'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zZMqce-nVYl-F3hw-V2eM-fbVa-gvW2-fcBvFm', 'scsi-0QEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052', 'scsi-SQEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787729 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787739 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e', 'scsi-SQEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787750 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787760 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-41-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787770 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}) 
 2025-09-23 19:22:59.787780 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787801 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787817 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787866 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787876 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 
'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787887 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part1', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part14', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part15', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part16', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part16'], 'labels': ['BOOT'], 'masters': 
[], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787905 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-44-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.787921 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.787935 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787946 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 
'virtual': 1}})  2025-09-23 19:22:59.787956 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787965 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787976 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787986 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.787995 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 
'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788006 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788028 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part1', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part14', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': 
['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part15', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part16', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.788051 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-40-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.788061 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.788071 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.788104 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788116 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788126 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788143 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788164 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  
2025-09-23 19:22:59.788175 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788185 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788194 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:22:59.788205 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part1', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part14', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part15', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part16', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.788229 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-46-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:22:59.788239 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.788249 | orchestrator | 2025-09-23 19:22:59.788259 | orchestrator | TASK [ceph-facts : Set_fact devices generate device list when osd_auto_discovery] *** 2025-09-23 19:22:59.788274 | orchestrator | Tuesday 23 September 2025 19:12:45 +0000 (0:00:01.652) 0:00:31.596 ***** 2025-09-23 19:22:59.788285 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e-osd--block--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e', 'dm-uuid-LVM-BYh1we6l1Rbny4mpPNGVfmFVqmlrDTdadBL2afMPVC7aYkeSl0VtWfEEDEItKBqD'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788297 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 
'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ad3a695b--9edf--562e--89c9--18fadd13d262-osd--block--ad3a695b--9edf--562e--89c9--18fadd13d262', 'dm-uuid-LVM-NcdhDJBq0TBcw9ePnA00uXvA5tL30WE3Z4S8MCdepjendah0VppDSjGgz9nPXIRI'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788308 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788318 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788334 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': 
True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788355 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--1c8984fd--f811--541c--8648--d34ada8a5304-osd--block--1c8984fd--f811--541c--8648--d34ada8a5304', 'dm-uuid-LVM-nw8hRIb2eDpdk169y1rdcFUze1XfuOjJllJ9bGkQ0w0EH5YlPs5Idof0C67ssk46'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788366 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--8028f60e--1a44--5536--9db2--40f94e230aee-osd--block--8028f60e--1a44--5536--9db2--40f94e230aee', 'dm-uuid-LVM-XJfizJhV9UhWBv2FwGTBmsjdeRQx0bAnGRp8GEaT01hL7vlh46uUFrtFT6WiLhoZ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788376 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788386 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788396 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 
'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788412 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788428 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788442 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': 
None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788453 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788463 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788473 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 
19:22:59.788483 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788513 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part1', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part14', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': 
['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part15', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part16', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788526 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e-osd--block--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-qwNbwq-ZWIw-gtu3-bkEl-T6U4-liMO-iGzhMR', 'scsi-0QEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5', 'scsi-SQEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788537 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--ad3a695b--9edf--562e--89c9--18fadd13d262-osd--block--ad3a695b--9edf--562e--89c9--18fadd13d262'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-2FuINB-zcer-4mIL-BOFU-w1dA-hCsm-AWvBtO', 'scsi-0QEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd', 'scsi-SQEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788568 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6', 'scsi-SQEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788591 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-43-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788609 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788623 | orchestrator | skipping: 
[testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788634 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788665 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part1', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part14', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part15', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part16', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2025-09-23 19:22:59.788677 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5-osd--block--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5', 'dm-uuid-LVM-cTPR2qR6Zc8oAkE17BbZLrodQs1QMSHCIyAIczA6d59xBSXvG9KA9cu5ghiYSaro'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788688 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--1c8984fd--f811--541c--8648--d34ada8a5304-osd--block--1c8984fd--f811--541c--8648--d34ada8a5304'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-A4Gl3a-uEFo-1YjV-onOt-lDti-Rblb-3dFZee', 'scsi-0QEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85', 'scsi-SQEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788707 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--8028f60e--1a44--5536--9db2--40f94e230aee-osd--block--8028f60e--1a44--5536--9db2--40f94e230aee'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Ke1zo1-HE7e-DXga-aXdS-u4PO-3JOJ-cGNfpd', 'scsi-0QEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd', 'scsi-SQEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788728 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--a2ccb3fa--3e8c--5172--95cb--7cae39233d42-osd--block--a2ccb3fa--3e8c--5172--95cb--7cae39233d42', 
'dm-uuid-LVM-QxliPBJmTpLitQexep3vAZAasAjeKSby8Zpqm5RSUCw8quKD9lV8fEk8m3kUJSyu'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788740 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8', 'scsi-SQEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788750 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-40-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 
'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788760 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788776 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788786 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.788796 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 
'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788811 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.788822 | orchestrator | 2025-09-23 19:22:59 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED  2025-09-23 19:22:59.789001 | orchestrator | 2025-09-23 19:22:59 | INFO  | Wait 1 second(s) until the next check  2025-09-23 19:22:59.789020 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.789030 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0',
'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.789040 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.789059 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:22:59.789105 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part1', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part14', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part15', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part16', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
2025-09-23 19:22:59.789119 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5-osd--block--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-E2n78L-aS7J-rCCR-K0MN-C7Uz-Tc8Z-2fbQrV', 'scsi-0QEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b', 'scsi-SQEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789137 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789147 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789158 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--a2ccb3fa--3e8c--5172--95cb--7cae39233d42-osd--block--a2ccb3fa--3e8c--5172--95cb--7cae39233d42'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zZMqce-nVYl-F3hw-V2eM-fbVa-gvW2-fcBvFm', 'scsi-0QEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052', 'scsi-SQEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789178 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789189 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e', 'scsi-SQEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789199 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789215 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789225 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789236 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-41-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789255 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789266 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789277 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part1', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part14', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part15', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part16', 'scsi-SQEMU_QEMU_HARDDISK_111e41fd-1cdd-43db-a49a-f2bb4cafdaf0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
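The two `false_condition` values in the skipped items above come from ceph-ansible's `when:` clauses: control-plane hosts such as testbed-node-0 fail `inventory_hostname in groups.get(osd_group_name, [])` because they are not in the OSD group, while OSD hosts such as testbed-node-5 fail `osd_auto_discovery | default(False) | bool` because auto discovery is disabled in this testbed. A minimal plain-Python restatement of that gating (illustrative only; Ansible evaluates these as Jinja2 expressions, and `skip_reason` is a hypothetical helper, not a ceph-ansible function):

```python
def skip_reason(host, osd_group_hosts, osd_auto_discovery=False):
    """Return the when-clause that fails for `host`, or None if none fails."""
    if host not in osd_group_hosts:
        # Hosts outside the OSD group skip every device item.
        return "inventory_hostname in groups.get(osd_group_name, [])"
    if not osd_auto_discovery:
        # OSD hosts still skip the per-device loop while auto discovery is off.
        return "osd_auto_discovery | default(False) | bool"
    return None

# OSD group as seen in this job (testbed-node-3..5 run the ceph-osd role).
osd_hosts = {"testbed-node-3", "testbed-node-4", "testbed-node-5"}
```

With `osd_auto_discovery=True` on an OSD host, `skip_reason` returns `None` and the device item would be processed instead of skipped.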
2025-09-23 19:22:59.789298 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-44-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789312 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.789343 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789354 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789370 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789380 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789390 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789401 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789422 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789433 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789443 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.789458 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.789469 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part1', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part14', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part15', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part16', 'scsi-SQEMU_QEMU_HARDDISK_48d13180-cb46-42fb-bb48-4118091051be-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789480 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-40-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789491 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.789506 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789517 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789636 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789680 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789699 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789711 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789738 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789750 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789772 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part1', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part14', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part15', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part16', 'scsi-SQEMU_QEMU_HARDDISK_40586798-a938-4a0a-ac1b-5e3307fb08ff-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
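The `ansible_devices` facts echoed in these items also show which disks are already consumed: on testbed-node-5, `sdb` and `sdc` list ceph LVM volumes under `holders`/`masters`, while `sdd` has empty `holders` and no partitions, i.e. it is still unused. A small sketch (hypothetical helper, not part of the playbook) of how such facts can be filtered for free data disks:

```python
def free_disks(ansible_devices):
    """Return device names that look unused in an ansible_devices dict:
    no holders (e.g. LVM), no partitions, not removable, non-zero size."""
    free = []
    for name, dev in ansible_devices.items():
        if dev.get("holders") or dev.get("partitions"):
            continue  # already claimed (ceph LV) or partitioned (root disk)
        if dev.get("removable") == "1":
            continue  # e.g. the config-drive sr0 device
        if not dev.get("sectors"):
            continue  # empty loop devices report 0 sectors
        free.append(name)
    return free

# Trimmed-down facts modeled on the testbed-node-5 output above.
devices = {
    "sdb": {"holders": ["ceph--...osd--block--..."], "partitions": {},
            "removable": "0", "sectors": 41943040},
    "sdd": {"holders": [], "partitions": {}, "removable": "0",
            "sectors": 41943040},
    "sr0": {"holders": [], "partitions": {}, "removable": "1",
            "sectors": 253},
    "loop0": {"holders": [], "partitions": {}, "removable": "0",
              "sectors": 0},
}
```

Applied to the sample dict, only `sdd` survives the filter, matching what the logged facts imply.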
2025-09-23 19:22:59.789784 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-46-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:22:59.789796 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.789808 | orchestrator |
2025-09-23 19:22:59.789819 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ******************************
2025-09-23 19:22:59.789830 | orchestrator | Tuesday 23 September 2025 19:12:48 +0000 (0:00:02.516) 0:00:34.112 *****
2025-09-23 19:22:59.789850 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.789862 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.789873 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.789884 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.789895 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.789905 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.789916 | orchestrator |
2025-09-23 19:22:59.789928 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] ***************
2025-09-23 19:22:59.789946 | orchestrator | Tuesday 23 September 2025 19:12:49 +0000 (0:00:01.427) 0:00:35.540 *****
2025-09-23 19:22:59.789956 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.789965 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.789975 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.789984 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.789993 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.790003 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.790012 | orchestrator |
2025-09-23 19:22:59.790073 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2025-09-23 19:22:59.790107 | orchestrator | Tuesday 23 September 2025 19:12:50 +0000 (0:00:00.759) 0:00:36.299 *****
2025-09-23 19:22:59.790118 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.790128 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.790138 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.790147 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.790157 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.790167 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.790177 | orchestrator |
2025-09-23 19:22:59.790186 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2025-09-23 19:22:59.790196 | orchestrator | Tuesday 23 September 2025 19:12:51 +0000 (0:00:00.897) 0:00:37.197 *****
2025-09-23 19:22:59.790206 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.790215 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.790225 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.790234 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.790244 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.790253 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.790266 | orchestrator |
2025-09-23 19:22:59.790282 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2025-09-23 19:22:59.790307 | orchestrator | Tuesday 23 September 2025 19:12:52 +0000 (0:00:00.652) 0:00:37.850 *****
2025-09-23 19:22:59.790323 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.790338 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.790354 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.790369 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.790384 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.790399 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.790415 | orchestrator |
2025-09-23 19:22:59.790432 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2025-09-23 19:22:59.790448 | orchestrator | Tuesday 23 September 2025 19:12:53 +0000 (0:00:01.235) 0:00:39.085 *****
2025-09-23 19:22:59.790464 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.790480 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.790495 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.790512 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.790528 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.790543 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.790560 | orchestrator |
2025-09-23 19:22:59.790576 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] *************************
2025-09-23 19:22:59.790591 | orchestrator | Tuesday 23 September 2025 19:12:55 +0000 (0:00:01.553) 0:00:40.639 *****
2025-09-23 19:22:59.790606 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0)
2025-09-23 19:22:59.790621 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0)
2025-09-23 19:22:59.790635 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1)
2025-09-23 19:22:59.790650 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0)
2025-09-23 19:22:59.790698 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0)
2025-09-23 19:22:59.790715 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1)
2025-09-23 19:22:59.790730 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1)
2025-09-23 19:22:59.790746 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-2)
2025-09-23 19:22:59.790774 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2)
2025-09-23 19:22:59.790790 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-0)
2025-09-23 19:22:59.790806 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1)
2025-09-23 19:22:59.790830 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-1)
2025-09-23 19:22:59.790850 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2)
2025-09-23 19:22:59.790866 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-2)
2025-09-23 19:22:59.790882 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2)
2025-09-23 19:22:59.790898 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-0)
2025-09-23 19:22:59.790913 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-1)
2025-09-23 19:22:59.790928 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-2)
2025-09-23 19:22:59.790944 | orchestrator |
2025-09-23 19:22:59.790971 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] *************************
2025-09-23 19:22:59.790987 | orchestrator | Tuesday 23 September 2025 19:12:58 +0000 (0:00:03.763) 0:00:44.402 *****
2025-09-23 19:22:59.791003 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2025-09-23 19:22:59.791018 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2025-09-23 19:22:59.791033 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2025-09-23 19:22:59.791049 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.791064 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2025-09-23 19:22:59.791079 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2025-09-23 19:22:59.791171 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2025-09-23 19:22:59.791184 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.791196 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2025-09-23 19:22:59.791208 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2025-09-23 19:22:59.791243 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2025-09-23 19:22:59.791263 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.791275 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2025-09-23 19:22:59.791286 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2025-09-23 19:22:59.791297 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2025-09-23 19:22:59.791310 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.791322 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)
2025-09-23 19:22:59.791334 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)
2025-09-23 19:22:59.791354 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)
2025-09-23 19:22:59.791371 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.791391 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)
2025-09-23 19:22:59.791405 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)
2025-09-23 19:22:59.791418 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)
2025-09-23 19:22:59.791431 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.791444 | orchestrator |
2025-09-23 19:22:59.791463 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] ***********************
2025-09-23 19:22:59.791482 | orchestrator | Tuesday 23 September 2025 19:13:00 +0000 (0:00:01.427) 0:00:45.830 *****
2025-09-23 19:22:59.791501 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.791515 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.791528 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.791543 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:22:59.791556 | orchestrator |
2025-09-23 19:22:59.791570 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] ***
2025-09-23 19:22:59.791591 | orchestrator | Tuesday 23 September 2025 19:13:02 +0000 (0:00:01.938) 0:00:47.769 *****
2025-09-23 19:22:59.791624 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.791639 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.791652 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.791664 | orchestrator |
2025-09-23 19:22:59.791677 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] ****
2025-09-23 19:22:59.791691 | orchestrator | Tuesday 23 September 2025 19:13:02 +0000 (0:00:00.498) 0:00:48.268 *****
2025-09-23 19:22:59.791704 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.791717 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.791731 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.791743 | orchestrator |
2025-09-23 19:22:59.791756 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] ****
2025-09-23 19:22:59.791769 | orchestrator | Tuesday 23 September 2025 19:13:03 +0000 (0:00:00.679) 0:00:48.947 *****
2025-09-23 19:22:59.791783 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.791796 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.791808 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.791821 | orchestrator |
2025-09-23 19:22:59.791835 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] ***************
2025-09-23 19:22:59.791848 | orchestrator | Tuesday 23 September 2025 19:13:04 +0000 (0:00:01.122) 0:00:50.069 *****
2025-09-23 19:22:59.791861 | orchestrator |
ok: [testbed-node-4] 2025-09-23 19:22:59.791874 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.791888 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.791902 | orchestrator | 2025-09-23 19:22:59.791918 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2025-09-23 19:22:59.791936 | orchestrator | Tuesday 23 September 2025 19:13:05 +0000 (0:00:00.837) 0:00:50.906 ***** 2025-09-23 19:22:59.791957 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.791972 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.791985 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.791998 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.792011 | orchestrator | 2025-09-23 19:22:59.792029 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2025-09-23 19:22:59.792045 | orchestrator | Tuesday 23 September 2025 19:13:05 +0000 (0:00:00.521) 0:00:51.428 ***** 2025-09-23 19:22:59.792059 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.792072 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.792106 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.792120 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.792133 | orchestrator | 2025-09-23 19:22:59.792146 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2025-09-23 19:22:59.792159 | orchestrator | Tuesday 23 September 2025 19:13:06 +0000 (0:00:00.335) 0:00:51.764 ***** 2025-09-23 19:22:59.792173 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.792181 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.792189 | orchestrator | skipping: [testbed-node-3] 
=> (item=testbed-node-5)  2025-09-23 19:22:59.792197 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.792204 | orchestrator | 2025-09-23 19:22:59.792212 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2025-09-23 19:22:59.792220 | orchestrator | Tuesday 23 September 2025 19:13:06 +0000 (0:00:00.352) 0:00:52.117 ***** 2025-09-23 19:22:59.792228 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.792236 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.792243 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.792251 | orchestrator | 2025-09-23 19:22:59.792259 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2025-09-23 19:22:59.792267 | orchestrator | Tuesday 23 September 2025 19:13:06 +0000 (0:00:00.359) 0:00:52.476 ***** 2025-09-23 19:22:59.792274 | orchestrator | ok: [testbed-node-4] => (item=0) 2025-09-23 19:22:59.792290 | orchestrator | ok: [testbed-node-3] => (item=0) 2025-09-23 19:22:59.792298 | orchestrator | ok: [testbed-node-5] => (item=0) 2025-09-23 19:22:59.792306 | orchestrator | 2025-09-23 19:22:59.792323 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] ************************************** 2025-09-23 19:22:59.792337 | orchestrator | Tuesday 23 September 2025 19:13:08 +0000 (0:00:02.028) 0:00:54.505 ***** 2025-09-23 19:22:59.792345 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-23 19:22:59.792353 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-23 19:22:59.792361 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-23 19:22:59.792369 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2025-09-23 19:22:59.792376 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2025-09-23 19:22:59.792384 | 
orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2025-09-23 19:22:59.792392 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2025-09-23 19:22:59.792400 | orchestrator | 2025-09-23 19:22:59.792407 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ******************************** 2025-09-23 19:22:59.792415 | orchestrator | Tuesday 23 September 2025 19:13:09 +0000 (0:00:01.040) 0:00:55.545 ***** 2025-09-23 19:22:59.792423 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-23 19:22:59.792430 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-23 19:22:59.792438 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-23 19:22:59.792446 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2025-09-23 19:22:59.792453 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2025-09-23 19:22:59.792461 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2025-09-23 19:22:59.792469 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2025-09-23 19:22:59.792476 | orchestrator | 2025-09-23 19:22:59.792484 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-23 19:22:59.792492 | orchestrator | Tuesday 23 September 2025 19:13:11 +0000 (0:00:01.942) 0:00:57.488 ***** 2025-09-23 19:22:59.792500 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.792509 | orchestrator | 2025-09-23 19:22:59.792517 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] 
********************* 2025-09-23 19:22:59.792525 | orchestrator | Tuesday 23 September 2025 19:13:13 +0000 (0:00:01.310) 0:00:58.798 ***** 2025-09-23 19:22:59.792532 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-0, testbed-node-1, testbed-node-5, testbed-node-2 2025-09-23 19:22:59.792541 | orchestrator | 2025-09-23 19:22:59.792548 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-23 19:22:59.792556 | orchestrator | Tuesday 23 September 2025 19:13:14 +0000 (0:00:01.568) 0:01:00.367 ***** 2025-09-23 19:22:59.792564 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.792571 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.792579 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.792587 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.792595 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.792602 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.792610 | orchestrator | 2025-09-23 19:22:59.792618 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-23 19:22:59.792626 | orchestrator | Tuesday 23 September 2025 19:13:16 +0000 (0:00:01.399) 0:01:01.767 ***** 2025-09-23 19:22:59.792633 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.792646 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.792654 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.792662 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.792669 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.792677 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.792685 | orchestrator | 2025-09-23 19:22:59.792693 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-23 19:22:59.792700 | orchestrator | Tuesday 23 September 2025 19:13:17 +0000 
(0:00:01.215) 0:01:02.982 ***** 2025-09-23 19:22:59.792708 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.792716 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.792723 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.792731 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.792739 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.792747 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.792754 | orchestrator | 2025-09-23 19:22:59.792762 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-23 19:22:59.792770 | orchestrator | Tuesday 23 September 2025 19:13:18 +0000 (0:00:01.046) 0:01:04.029 ***** 2025-09-23 19:22:59.792778 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.792785 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.792793 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.792801 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.792808 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.792816 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.792824 | orchestrator | 2025-09-23 19:22:59.792832 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-23 19:22:59.792839 | orchestrator | Tuesday 23 September 2025 19:13:19 +0000 (0:00:00.879) 0:01:04.908 ***** 2025-09-23 19:22:59.792847 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.792855 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.792863 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.792870 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.792878 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.792886 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.792893 | orchestrator | 2025-09-23 19:22:59.792901 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 
2025-09-23 19:22:59.792917 | orchestrator | Tuesday 23 September 2025 19:13:20 +0000 (0:00:01.223) 0:01:06.132 ***** 2025-09-23 19:22:59.792926 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.792934 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.792941 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.792949 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.792957 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.792964 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.792972 | orchestrator | 2025-09-23 19:22:59.792980 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-23 19:22:59.792987 | orchestrator | Tuesday 23 September 2025 19:13:21 +0000 (0:00:00.548) 0:01:06.680 ***** 2025-09-23 19:22:59.792995 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.793003 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.793010 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.793018 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.793026 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.793033 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.793041 | orchestrator | 2025-09-23 19:22:59.793049 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-23 19:22:59.793057 | orchestrator | Tuesday 23 September 2025 19:13:21 +0000 (0:00:00.520) 0:01:07.200 ***** 2025-09-23 19:22:59.793064 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.793072 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.793092 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.793101 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.793108 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.793122 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.793130 | orchestrator | 2025-09-23 
19:22:59.793137 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-23 19:22:59.793145 | orchestrator | Tuesday 23 September 2025 19:13:23 +0000 (0:00:01.777) 0:01:08.978 ***** 2025-09-23 19:22:59.793153 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.793160 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.793168 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.793176 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.793184 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.793191 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.793199 | orchestrator | 2025-09-23 19:22:59.793207 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-23 19:22:59.793215 | orchestrator | Tuesday 23 September 2025 19:13:24 +0000 (0:00:01.297) 0:01:10.275 ***** 2025-09-23 19:22:59.793223 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.793230 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.793238 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.793246 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.793254 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.793261 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.793269 | orchestrator | 2025-09-23 19:22:59.793277 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-23 19:22:59.793285 | orchestrator | Tuesday 23 September 2025 19:13:25 +0000 (0:00:00.960) 0:01:11.235 ***** 2025-09-23 19:22:59.793292 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.793300 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.793308 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.793315 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.793323 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.793331 | 
orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.793339 | orchestrator | 2025-09-23 19:22:59.793347 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-23 19:22:59.793354 | orchestrator | Tuesday 23 September 2025 19:13:26 +0000 (0:00:00.641) 0:01:11.877 ***** 2025-09-23 19:22:59.793362 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.793370 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.793378 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.793385 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.793393 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.793401 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.793408 | orchestrator | 2025-09-23 19:22:59.793416 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-23 19:22:59.793424 | orchestrator | Tuesday 23 September 2025 19:13:26 +0000 (0:00:00.724) 0:01:12.601 ***** 2025-09-23 19:22:59.793432 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.793440 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.793447 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.793455 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.793463 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.793470 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.793478 | orchestrator | 2025-09-23 19:22:59.793486 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-23 19:22:59.793494 | orchestrator | Tuesday 23 September 2025 19:13:27 +0000 (0:00:00.713) 0:01:13.315 ***** 2025-09-23 19:22:59.793501 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.793509 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.793517 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.793524 | orchestrator | skipping: [testbed-node-0] 2025-09-23 
19:22:59.793532 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.793540 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.793547 | orchestrator | 2025-09-23 19:22:59.793555 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-23 19:22:59.793563 | orchestrator | Tuesday 23 September 2025 19:13:28 +0000 (0:00:00.726) 0:01:14.041 ***** 2025-09-23 19:22:59.793576 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.793584 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.793592 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.793599 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.793607 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.793615 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.793622 | orchestrator | 2025-09-23 19:22:59.793630 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-23 19:22:59.793638 | orchestrator | Tuesday 23 September 2025 19:13:29 +0000 (0:00:00.609) 0:01:14.651 ***** 2025-09-23 19:22:59.793646 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.793653 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.793661 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.793669 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.793677 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.793684 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.793692 | orchestrator | 2025-09-23 19:22:59.793713 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-23 19:22:59.793722 | orchestrator | Tuesday 23 September 2025 19:13:29 +0000 (0:00:00.682) 0:01:15.334 ***** 2025-09-23 19:22:59.793730 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.793737 | orchestrator | skipping: [testbed-node-4] 2025-09-23 
19:22:59.793745 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.793753 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.793760 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.793768 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.793776 | orchestrator | 2025-09-23 19:22:59.793784 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-23 19:22:59.793791 | orchestrator | Tuesday 23 September 2025 19:13:30 +0000 (0:00:00.549) 0:01:15.884 ***** 2025-09-23 19:22:59.793799 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.793807 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.793815 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.793822 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.793830 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.793838 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.793845 | orchestrator | 2025-09-23 19:22:59.793853 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-23 19:22:59.793861 | orchestrator | Tuesday 23 September 2025 19:13:31 +0000 (0:00:00.766) 0:01:16.651 ***** 2025-09-23 19:22:59.793869 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.793876 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.793884 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.793892 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.793899 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.793907 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.793915 | orchestrator | 2025-09-23 19:22:59.793922 | orchestrator | TASK [ceph-container-common : Generate systemd ceph target file] *************** 2025-09-23 19:22:59.793930 | orchestrator | Tuesday 23 September 2025 19:13:32 +0000 (0:00:01.060) 0:01:17.711 ***** 2025-09-23 19:22:59.793938 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.793946 | 
orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.793954 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.793961 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.793969 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.793977 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.793984 | orchestrator | 2025-09-23 19:22:59.793992 | orchestrator | TASK [ceph-container-common : Enable ceph.target] ****************************** 2025-09-23 19:22:59.794000 | orchestrator | Tuesday 23 September 2025 19:13:33 +0000 (0:00:01.435) 0:01:19.147 ***** 2025-09-23 19:22:59.794008 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.794052 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.794062 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.794070 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.794126 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.794135 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.794143 | orchestrator | 2025-09-23 19:22:59.794151 | orchestrator | TASK [ceph-container-common : Include prerequisites.yml] *********************** 2025-09-23 19:22:59.794159 | orchestrator | Tuesday 23 September 2025 19:13:35 +0000 (0:00:02.179) 0:01:21.326 ***** 2025-09-23 19:22:59.794167 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/prerequisites.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.794175 | orchestrator | 2025-09-23 19:22:59.794182 | orchestrator | TASK [ceph-container-common : Stop lvmetad] ************************************ 2025-09-23 19:22:59.794190 | orchestrator | Tuesday 23 September 2025 19:13:36 +0000 (0:00:01.098) 0:01:22.425 ***** 2025-09-23 19:22:59.794198 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.794206 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.794213 | orchestrator | 
skipping: [testbed-node-5] 2025-09-23 19:22:59.794221 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.794229 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.794237 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.794244 | orchestrator | 2025-09-23 19:22:59.794252 | orchestrator | TASK [ceph-container-common : Disable and mask lvmetad service] **************** 2025-09-23 19:22:59.794260 | orchestrator | Tuesday 23 September 2025 19:13:37 +0000 (0:00:00.513) 0:01:22.938 ***** 2025-09-23 19:22:59.794267 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.794275 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.794283 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.794290 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.794298 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.794306 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.794313 | orchestrator | 2025-09-23 19:22:59.794321 | orchestrator | TASK [ceph-container-common : Remove ceph udev rules] ************************** 2025-09-23 19:22:59.794329 | orchestrator | Tuesday 23 September 2025 19:13:38 +0000 (0:00:00.678) 0:01:23.617 ***** 2025-09-23 19:22:59.794337 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-23 19:22:59.794345 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-23 19:22:59.794351 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-23 19:22:59.794358 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-23 19:22:59.794364 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-23 19:22:59.794371 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-23 19:22:59.794377 | orchestrator | ok: [testbed-node-3] => 
(item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-23 19:22:59.794384 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-23 19:22:59.794390 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-23 19:22:59.794397 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-23 19:22:59.794404 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-23 19:22:59.794427 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-23 19:22:59.794434 | orchestrator | 2025-09-23 19:22:59.794440 | orchestrator | TASK [ceph-container-common : Ensure tmpfiles.d is present] ******************** 2025-09-23 19:22:59.794447 | orchestrator | Tuesday 23 September 2025 19:13:39 +0000 (0:00:01.511) 0:01:25.128 ***** 2025-09-23 19:22:59.794454 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.794460 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.794467 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.794474 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.794485 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.794491 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.794498 | orchestrator | 2025-09-23 19:22:59.794504 | orchestrator | TASK [ceph-container-common : Restore certificates selinux context] ************ 2025-09-23 19:22:59.794511 | orchestrator | Tuesday 23 September 2025 19:13:40 +0000 (0:00:01.094) 0:01:26.223 ***** 2025-09-23 19:22:59.794518 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.794524 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.794531 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.794537 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.794544 | 
orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.794550 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.794557 | orchestrator |
2025-09-23 19:22:59.794563 | orchestrator | TASK [ceph-container-common : Install python3 on osd nodes] ********************
2025-09-23 19:22:59.794570 | orchestrator | Tuesday 23 September 2025 19:13:41 +0000 (0:00:00.540) 0:01:26.763 *****
2025-09-23 19:22:59.794576 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.794583 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.794589 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.794596 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.794602 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.794609 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.794615 | orchestrator |
2025-09-23 19:22:59.794622 | orchestrator | TASK [ceph-container-common : Include registry.yml] ****************************
2025-09-23 19:22:59.794628 | orchestrator | Tuesday 23 September 2025 19:13:41 +0000 (0:00:00.655) 0:01:27.419 *****
2025-09-23 19:22:59.794635 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.794641 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.794648 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.794654 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.794661 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.794667 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.794674 | orchestrator |
2025-09-23 19:22:59.794680 | orchestrator | TASK [ceph-container-common : Include fetch_image.yml] *************************
2025-09-23 19:22:59.794687 | orchestrator | Tuesday 23 September 2025 19:13:42 +0000 (0:00:00.511) 0:01:27.931 *****
2025-09-23 19:22:59.794694 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/fetch_image.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:22:59.794700 | orchestrator |
2025-09-23 19:22:59.794707 | orchestrator | TASK [ceph-container-common : Pulling Ceph container image] ********************
2025-09-23 19:22:59.794714 | orchestrator | Tuesday 23 September 2025 19:13:43 +0000 (0:00:00.994) 0:01:28.925 *****
2025-09-23 19:22:59.794720 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.794727 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.794733 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.794740 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.794747 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.794753 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.794760 | orchestrator |
2025-09-23 19:22:59.794766 | orchestrator | TASK [ceph-container-common : Pulling alertmanager/prometheus/grafana container images] ***
2025-09-23 19:22:59.794773 | orchestrator | Tuesday 23 September 2025 19:14:23 +0000 (0:00:39.989) 0:02:08.915 *****
2025-09-23 19:22:59.794780 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/alertmanager:v0.16.2)
2025-09-23 19:22:59.794786 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/prometheus:v2.7.2)
2025-09-23 19:22:59.794793 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/grafana/grafana:6.7.4)
2025-09-23 19:22:59.794799 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.794806 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/alertmanager:v0.16.2)
2025-09-23 19:22:59.794812 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/prometheus:v2.7.2)
2025-09-23 19:22:59.794819 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/grafana/grafana:6.7.4)
2025-09-23 19:22:59.794830 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.794836 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/alertmanager:v0.16.2)
2025-09-23 19:22:59.794843 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/prometheus:v2.7.2)
2025-09-23 19:22:59.794849 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/grafana/grafana:6.7.4)
2025-09-23 19:22:59.794856 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.794863 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/alertmanager:v0.16.2)
2025-09-23 19:22:59.794869 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/prometheus:v2.7.2)
2025-09-23 19:22:59.794876 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/grafana/grafana:6.7.4)
2025-09-23 19:22:59.794882 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/alertmanager:v0.16.2)
2025-09-23 19:22:59.794889 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/prometheus:v2.7.2)
2025-09-23 19:22:59.794895 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/grafana/grafana:6.7.4)
2025-09-23 19:22:59.794906 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.794918 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.794930 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/alertmanager:v0.16.2)
2025-09-23 19:22:59.794947 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/prometheus:v2.7.2)
2025-09-23 19:22:59.794963 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/grafana/grafana:6.7.4)
2025-09-23 19:22:59.794977 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.794988 | orchestrator |
2025-09-23 19:22:59.795000 | orchestrator | TASK [ceph-container-common : Pulling node-exporter container image] ***********
2025-09-23 19:22:59.795012 | orchestrator | Tuesday 23 September 2025 19:14:24 +0000 (0:00:00.772) 0:02:09.688 *****
2025-09-23 19:22:59.795023 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795034 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795045 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795055 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795065 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795077 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795105 | orchestrator |
2025-09-23 19:22:59.795112 | orchestrator | TASK [ceph-container-common : Export local ceph dev image] *********************
2025-09-23 19:22:59.795122 | orchestrator | Tuesday 23 September 2025 19:14:24 +0000 (0:00:00.796) 0:02:10.484 *****
2025-09-23 19:22:59.795133 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795144 | orchestrator |
2025-09-23 19:22:59.795155 | orchestrator | TASK [ceph-container-common : Copy ceph dev image file] ************************
2025-09-23 19:22:59.795165 | orchestrator | Tuesday 23 September 2025 19:14:25 +0000 (0:00:00.153) 0:02:10.638 *****
2025-09-23 19:22:59.795176 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795188 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795199 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795208 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795215 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795221 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795228 | orchestrator |
2025-09-23 19:22:59.795234 | orchestrator | TASK [ceph-container-common : Load ceph dev image] *****************************
2025-09-23 19:22:59.795241 | orchestrator | Tuesday 23 September 2025 19:14:25 +0000 (0:00:00.744) 0:02:11.383 *****
2025-09-23 19:22:59.795247 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795254 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795261 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795267 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795274 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795280 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795287 | orchestrator |
2025-09-23 19:22:59.795293 | orchestrator | TASK [ceph-container-common : Remove tmp ceph dev image file] ******************
2025-09-23 19:22:59.795306 | orchestrator | Tuesday 23 September 2025 19:14:26 +0000 (0:00:01.089) 0:02:12.473 *****
2025-09-23 19:22:59.795313 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795319 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795326 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795333 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795339 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795346 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795352 | orchestrator |
2025-09-23 19:22:59.795359 | orchestrator | TASK [ceph-container-common : Get ceph version] ********************************
2025-09-23 19:22:59.795365 | orchestrator | Tuesday 23 September 2025 19:14:27 +0000 (0:00:00.939) 0:02:13.413 *****
2025-09-23 19:22:59.795372 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.795378 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.795385 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.795391 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.795398 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.795404 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.795411 | orchestrator |
2025-09-23 19:22:59.795418 | orchestrator | TASK [ceph-container-common : Set_fact ceph_version ceph_version.stdout.split] ***
2025-09-23 19:22:59.795424 | orchestrator | Tuesday 23 September 2025 19:14:30 +0000 (0:00:03.169) 0:02:16.583 *****
2025-09-23 19:22:59.795431 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.795437 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.795444 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.795450 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.795457 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.795463 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.795470 | orchestrator |
2025-09-23 19:22:59.795476 | orchestrator | TASK [ceph-container-common : Include release.yml] *****************************
2025-09-23 19:22:59.795483 | orchestrator | Tuesday 23 September 2025 19:14:31 +0000 (0:00:00.668) 0:02:17.252 *****
2025-09-23 19:22:59.795490 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/release.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:22:59.795498 | orchestrator |
2025-09-23 19:22:59.795505 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release jewel] *********************
2025-09-23 19:22:59.795511 | orchestrator | Tuesday 23 September 2025 19:14:32 +0000 (0:00:01.006) 0:02:18.258 *****
2025-09-23 19:22:59.795518 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795524 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795531 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795537 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795544 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795550 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795557 | orchestrator |
2025-09-23 19:22:59.795563 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release kraken] ********************
2025-09-23 19:22:59.795570 | orchestrator | Tuesday 23 September 2025 19:14:33 +0000 (0:00:00.638) 0:02:18.897 *****
2025-09-23 19:22:59.795577 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795583 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795590 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795596 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795603 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795609 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795616 | orchestrator |
2025-09-23 19:22:59.795622 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release luminous] ******************
2025-09-23 19:22:59.795629 | orchestrator | Tuesday 23 September 2025 19:14:33 +0000 (0:00:00.510) 0:02:19.408 *****
2025-09-23 19:22:59.795635 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795642 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795648 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795655 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795662 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795677 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795684 | orchestrator |
2025-09-23 19:22:59.795695 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release mimic] *********************
2025-09-23 19:22:59.795702 | orchestrator | Tuesday 23 September 2025 19:14:34 +0000 (0:00:00.700) 0:02:20.109 *****
2025-09-23 19:22:59.795708 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795715 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795721 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795728 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795735 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795742 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795754 | orchestrator |
2025-09-23 19:22:59.795765 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release nautilus] ******************
2025-09-23 19:22:59.795775 | orchestrator | Tuesday 23 September 2025 19:14:35 +0000 (0:00:00.861) 0:02:20.970 *****
2025-09-23 19:22:59.795786 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795797 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795808 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795815 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795822 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795828 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795835 | orchestrator |
2025-09-23 19:22:59.795842 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release octopus] *******************
2025-09-23 19:22:59.795851 | orchestrator | Tuesday 23 September 2025 19:14:36 +0000 (0:00:00.678) 0:02:21.649 *****
2025-09-23 19:22:59.795862 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795873 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795884 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795895 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.795906 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795919 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.795926 | orchestrator |
2025-09-23 19:22:59.795932 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release pacific] *******************
2025-09-23 19:22:59.795939 | orchestrator | Tuesday 23 September 2025 19:14:36 +0000 (0:00:00.694) 0:02:22.344 *****
2025-09-23 19:22:59.795946 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.795957 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.795968 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.795979 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.795990 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.796001 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.796012 | orchestrator |
2025-09-23 19:22:59.796023 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release quincy] ********************
2025-09-23 19:22:59.796033 | orchestrator | Tuesday 23 September 2025 19:14:37 +0000 (0:00:00.473) 0:02:22.818 *****
2025-09-23 19:22:59.796040 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.796046 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.796055 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.796066 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.796077 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.796130 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.796142 | orchestrator |
2025-09-23 19:22:59.796153 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release reef] **********************
2025-09-23 19:22:59.796165 | orchestrator | Tuesday 23 September 2025 19:14:37 +0000 (0:00:00.655) 0:02:23.474 *****
2025-09-23 19:22:59.796176 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.796187 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.796199 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.796210 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.796221 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.796232 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.796243 | orchestrator |
2025-09-23 19:22:59.796253 | orchestrator | TASK [ceph-config : Include create_ceph_initial_dirs.yml] **********************
2025-09-23 19:22:59.796271 | orchestrator | Tuesday 23 September 2025 19:14:39 +0000 (0:00:01.230) 0:02:24.704 *****
2025-09-23 19:22:59.796281 | orchestrator | included: /ansible/roles/ceph-config/tasks/create_ceph_initial_dirs.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:22:59.796291 | orchestrator |
2025-09-23 19:22:59.796301 | orchestrator | TASK [ceph-config : Create ceph initial directories] ***************************
2025-09-23 19:22:59.796311 | orchestrator | Tuesday 23 September 2025 19:14:40 +0000 (0:00:00.913) 0:02:25.618 *****
2025-09-23 19:22:59.796322 | orchestrator | changed: [testbed-node-5] => (item=/etc/ceph)
2025-09-23 19:22:59.796329 | orchestrator | changed: [testbed-node-0] => (item=/etc/ceph)
2025-09-23 19:22:59.796335 | orchestrator | changed: [testbed-node-1] => (item=/etc/ceph)
2025-09-23 19:22:59.796341 | orchestrator | changed: [testbed-node-2] => (item=/etc/ceph)
2025-09-23 19:22:59.796347 | orchestrator | changed: [testbed-node-3] => (item=/etc/ceph)
2025-09-23 19:22:59.796353 | orchestrator | changed: [testbed-node-4] => (item=/etc/ceph)
2025-09-23 19:22:59.796359 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/)
2025-09-23 19:22:59.796365 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/)
2025-09-23 19:22:59.796372 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/)
2025-09-23 19:22:59.796378 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/)
2025-09-23 19:22:59.796384 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/)
2025-09-23 19:22:59.796390 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/)
2025-09-23 19:22:59.796396 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mon)
2025-09-23 19:22:59.796402 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mon)
2025-09-23 19:22:59.796408 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mon)
2025-09-23 19:22:59.796414 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mon)
2025-09-23 19:22:59.796420 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mon)
2025-09-23 19:22:59.796426 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mon)
2025-09-23 19:22:59.796432 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/osd)
2025-09-23 19:22:59.796438 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/osd)
2025-09-23 19:22:59.796454 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/osd)
2025-09-23 19:22:59.796460 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/osd)
2025-09-23 19:22:59.796467 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/osd)
2025-09-23 19:22:59.796473 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/osd)
2025-09-23 19:22:59.796479 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds)
2025-09-23 19:22:59.796485 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mds)
2025-09-23 19:22:59.796491 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mds)
2025-09-23 19:22:59.796497 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds)
2025-09-23 19:22:59.796503 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds)
2025-09-23 19:22:59.796509 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mds)
2025-09-23 19:22:59.796515 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/tmp)
2025-09-23 19:22:59.796521 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/tmp)
2025-09-23 19:22:59.796527 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/tmp)
2025-09-23 19:22:59.796533 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/tmp)
2025-09-23 19:22:59.796539 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/tmp)
2025-09-23 19:22:59.796545 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/tmp)
2025-09-23 19:22:59.796551 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/crash)
2025-09-23 19:22:59.796557 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/crash)
2025-09-23 19:22:59.796568 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/crash)
2025-09-23 19:22:59.796574 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/crash)
2025-09-23 19:22:59.796580 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/crash)
2025-09-23 19:22:59.796586 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/crash)
2025-09-23 19:22:59.796593 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/radosgw)
2025-09-23 19:22:59.796599 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/radosgw)
2025-09-23 19:22:59.796605 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/radosgw)
2025-09-23 19:22:59.796611 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/radosgw)
2025-09-23 19:22:59.796617 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/radosgw)
2025-09-23 19:22:59.796623 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/radosgw)
2025-09-23 19:22:59.796629 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rgw)
2025-09-23 19:22:59.796635 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rgw)
2025-09-23 19:22:59.796641 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rgw)
2025-09-23 19:22:59.796647 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rgw)
2025-09-23 19:22:59.796653 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rgw)
2025-09-23 19:22:59.796659 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mgr)
2025-09-23 19:22:59.796665 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mgr)
2025-09-23 19:22:59.796671 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rgw)
2025-09-23 19:22:59.796678 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mgr)
2025-09-23 19:22:59.796684 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mgr)
2025-09-23 19:22:59.796690 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mgr)
2025-09-23 19:22:59.796696 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds)
2025-09-23 19:22:59.796702 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mgr)
2025-09-23 19:22:59.796708 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mds)
2025-09-23 19:22:59.796714 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds)
2025-09-23 19:22:59.796720 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mds)
2025-09-23 19:22:59.796726 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds)
2025-09-23 19:22:59.796732 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd)
2025-09-23 19:22:59.796738 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mds)
2025-09-23 19:22:59.796744 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-osd)
2025-09-23 19:22:59.796750 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd)
2025-09-23 19:22:59.796756 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-osd)
2025-09-23 19:22:59.796762 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd)
2025-09-23 19:22:59.796768 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd)
2025-09-23 19:22:59.796774 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-osd)
2025-09-23 19:22:59.796780 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd)
2025-09-23 19:22:59.796786 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd)
2025-09-23 19:22:59.796792 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2025-09-23 19:22:59.796798 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd)
2025-09-23 19:22:59.796804 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd)
2025-09-23 19:22:59.796822 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2025-09-23 19:22:59.796829 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd)
2025-09-23 19:22:59.796835 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2025-09-23 19:22:59.796841 | orchestrator | changed: [testbed-node-5] => (item=/var/run/ceph)
2025-09-23 19:22:59.796847 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2025-09-23 19:22:59.796853 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2025-09-23 19:22:59.796859 | orchestrator | changed: [testbed-node-1] => (item=/var/run/ceph)
2025-09-23 19:22:59.796866 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd-mirror)
2025-09-23 19:22:59.796871 | orchestrator | changed: [testbed-node-3] => (item=/var/run/ceph)
2025-09-23 19:22:59.796878 | orchestrator | changed: [testbed-node-5] => (item=/var/log/ceph)
2025-09-23 19:22:59.796884 | orchestrator | changed: [testbed-node-0] => (item=/var/run/ceph)
2025-09-23 19:22:59.796890 | orchestrator | changed: [testbed-node-2] => (item=/var/run/ceph)
2025-09-23 19:22:59.796896 | orchestrator | changed: [testbed-node-1] => (item=/var/log/ceph)
2025-09-23 19:22:59.796902 | orchestrator | changed: [testbed-node-4] => (item=/var/run/ceph)
2025-09-23 19:22:59.796908 | orchestrator | changed: [testbed-node-3] => (item=/var/log/ceph)
2025-09-23 19:22:59.796914 | orchestrator | changed: [testbed-node-0] => (item=/var/log/ceph)
2025-09-23 19:22:59.796920 | orchestrator | changed: [testbed-node-2] => (item=/var/log/ceph)
2025-09-23 19:22:59.796926 | orchestrator | changed: [testbed-node-4] => (item=/var/log/ceph)
2025-09-23 19:22:59.796932 | orchestrator |
2025-09-23 19:22:59.796938 | orchestrator | TASK [ceph-config : Include_tasks rgw_systemd_environment_file.yml] ************
2025-09-23 19:22:59.796944 | orchestrator | Tuesday 23 September 2025 19:14:46 +0000 (0:00:06.346) 0:02:31.964 *****
2025-09-23 19:22:59.796951 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.796957 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.796963 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.796969 | orchestrator | included: /ansible/roles/ceph-config/tasks/rgw_systemd_environment_file.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:22:59.796975 | orchestrator |
2025-09-23 19:22:59.796981 | orchestrator | TASK [ceph-config : Create rados gateway instance directories] *****************
2025-09-23 19:22:59.796987 | orchestrator | Tuesday 23 September 2025 19:14:47 +0000 (0:00:01.191) 0:02:33.156 *****
2025-09-23 19:22:59.796994 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797000 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797007 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797013 | orchestrator |
2025-09-23 19:22:59.797019 | orchestrator | TASK [ceph-config : Generate environment file] *********************************
2025-09-23 19:22:59.797025 | orchestrator | Tuesday 23 September 2025 19:14:48 +0000 (0:00:00.807) 0:02:33.964 *****
2025-09-23 19:22:59.797031 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797037 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797043 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797049 | orchestrator |
2025-09-23 19:22:59.797055 | orchestrator | TASK [ceph-config : Reset num_osds] ********************************************
2025-09-23 19:22:59.797062 | orchestrator | Tuesday 23 September 2025 19:14:49 +0000 (0:00:01.398) 0:02:35.362 *****
2025-09-23 19:22:59.797073 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.797079 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.797104 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.797110 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797117 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797123 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797129 | orchestrator |
2025-09-23 19:22:59.797135 | orchestrator | TASK [ceph-config : Count number of osds for lvm scenario] *********************
2025-09-23 19:22:59.797141 | orchestrator | Tuesday 23 September 2025 19:14:50 +0000 (0:00:00.669) 0:02:36.031 *****
2025-09-23 19:22:59.797147 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.797153 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.797159 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.797165 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797171 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797177 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797183 | orchestrator |
2025-09-23 19:22:59.797189 | orchestrator | TASK [ceph-config : Look up for ceph-volume rejected devices] ******************
2025-09-23 19:22:59.797195 | orchestrator | Tuesday 23 September 2025 19:14:51 +0000 (0:00:00.975) 0:02:37.007 *****
2025-09-23 19:22:59.797201 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.797207 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.797213 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.797220 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797226 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797232 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797238 | orchestrator |
2025-09-23 19:22:59.797244 | orchestrator | TASK [ceph-config : Set_fact rejected_devices] *********************************
2025-09-23 19:22:59.797250 | orchestrator | Tuesday 23 September 2025 19:14:52 +0000 (0:00:00.671) 0:02:37.678 *****
2025-09-23 19:22:59.797267 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.797273 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.797279 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.797286 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797292 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797298 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797304 | orchestrator |
2025-09-23 19:22:59.797310 | orchestrator | TASK [ceph-config : Set_fact _devices] *****************************************
2025-09-23 19:22:59.797316 | orchestrator | Tuesday 23 September 2025 19:14:52 +0000 (0:00:00.797) 0:02:38.475 *****
2025-09-23 19:22:59.797323 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.797329 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.797335 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.797341 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797347 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797353 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797359 | orchestrator |
2025-09-23 19:22:59.797365 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm batch --report' to see how many osds are to be created] ***
2025-09-23 19:22:59.797371 | orchestrator | Tuesday 23 September 2025 19:14:53 +0000 (0:00:00.594) 0:02:39.070 *****
2025-09-23 19:22:59.797377 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.797383 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.797389 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.797396 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797402 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797408 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797414 | orchestrator |
2025-09-23 19:22:59.797420 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (legacy report)] ***
2025-09-23 19:22:59.797426 | orchestrator | Tuesday 23 September 2025 19:14:54 +0000 (0:00:00.656) 0:02:39.727 *****
2025-09-23 19:22:59.797432 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.797439 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.797451 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.797457 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797463 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797469 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797475 | orchestrator |
2025-09-23 19:22:59.797481 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (new report)] ***
2025-09-23 19:22:59.797488 | orchestrator | Tuesday 23 September 2025 19:14:54 +0000 (0:00:00.874) 0:02:40.601 *****
2025-09-23 19:22:59.797494 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.797500 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.797506 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.797512 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797518 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797524 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797530 | orchestrator |
2025-09-23 19:22:59.797536 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm list' to see how many osds have already been created] ***
2025-09-23 19:22:59.797542 | orchestrator | Tuesday 23 September 2025 19:14:55 +0000 (0:00:00.616) 0:02:41.217 *****
2025-09-23 19:22:59.797548 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797554 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797560 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797566 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.797572 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.797579 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.797585 | orchestrator |
2025-09-23 19:22:59.797591 | orchestrator | TASK [ceph-config : Set_fact num_osds (add existing osds)] *********************
2025-09-23 19:22:59.797597 | orchestrator | Tuesday 23 September 2025 19:14:59 +0000 (0:00:03.713) 0:02:44.931 *****
2025-09-23 19:22:59.797603 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.797609 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.797615 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.797621 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797627 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797633 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797639 | orchestrator |
2025-09-23 19:22:59.797645 | orchestrator | TASK [ceph-config : Set_fact _osd_memory_target] *******************************
2025-09-23 19:22:59.797651 | orchestrator | Tuesday 23 September 2025 19:14:59 +0000 (0:00:00.536) 0:02:45.468 *****
2025-09-23 19:22:59.797657 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.797664 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.797670 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797676 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.797682 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797688 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797694 | orchestrator |
2025-09-23 19:22:59.797700 | orchestrator | TASK [ceph-config : Set osd_memory_target to cluster host config] **************
2025-09-23 19:22:59.797706 | orchestrator | Tuesday 23 September 2025 19:15:00 +0000 (0:00:00.799) 0:02:46.268 *****
2025-09-23 19:22:59.797712 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.797718 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.797724 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.797730 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797736 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797742 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797748 | orchestrator |
2025-09-23 19:22:59.797755 | orchestrator | TASK [ceph-config : Render rgw configs] ****************************************
2025-09-23 19:22:59.797761 | orchestrator | Tuesday 23 September 2025 19:15:01 +0000 (0:00:00.592) 0:02:46.860 *****
2025-09-23 19:22:59.797767 | orchestrator | ok: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797773 | orchestrator | ok: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797786 | orchestrator | ok: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081})
2025-09-23 19:22:59.797792 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.797798 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.797804 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.797811 | orchestrator |
2025-09-23 19:22:59.797831 | orchestrator | TASK [ceph-config : Set config to cluster] *************************************
2025-09-23 19:22:59.797842 | orchestrator | Tuesday 23 September 2025 19:15:02 +0000 (0:00:00.913) 0:02:47.774 *****
2025-09-23 19:22:59.797854 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log'}])
2025-09-23 19:22:59.797872 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.13:8081'}])
2025-09-23 19:22:59.797885 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.797896 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log'}])
2025-09-23 19:22:59.797907 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.14:8081'}])
2025-09-23 19:22:59.797917 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.797927 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log'}])
2025-09-23 19:22:59.797934 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.15:8081'}])
 2025-09-23 19:22:59.797940 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.797946 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.797952 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.797958 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.797965 | orchestrator | 2025-09-23 19:22:59.797971 | orchestrator | TASK [ceph-config : Set rgw configs to file] *********************************** 2025-09-23 19:22:59.797977 | orchestrator | Tuesday 23 September 2025 19:15:03 +0000 (0:00:00.843) 0:02:48.617 ***** 2025-09-23 19:22:59.797983 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.797989 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.797995 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.798001 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.798007 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.798118 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.798131 | orchestrator | 2025-09-23 19:22:59.798138 | orchestrator | TASK [ceph-config : Create ceph conf directory] ******************************** 2025-09-23 19:22:59.798144 | orchestrator | Tuesday 23 September 2025 19:15:03 +0000 (0:00:00.681) 0:02:49.299 ***** 2025-09-23 19:22:59.798151 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.798157 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.798170 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.798176 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.798182 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.798188 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.798195 | orchestrator | 2025-09-23 19:22:59.798201 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2025-09-23 19:22:59.798207 | orchestrator | Tuesday 23 September 
2025 19:15:04 +0000 (0:00:00.480) 0:02:49.780 ***** 2025-09-23 19:22:59.798213 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.798220 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.798226 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.798232 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.798238 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.798244 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.798250 | orchestrator | 2025-09-23 19:22:59.798256 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2025-09-23 19:22:59.798262 | orchestrator | Tuesday 23 September 2025 19:15:05 +0000 (0:00:00.993) 0:02:50.773 ***** 2025-09-23 19:22:59.798268 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.798275 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.798281 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.798287 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.798293 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.798299 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.798305 | orchestrator | 2025-09-23 19:22:59.798311 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2025-09-23 19:22:59.798317 | orchestrator | Tuesday 23 September 2025 19:15:05 +0000 (0:00:00.700) 0:02:51.474 ***** 2025-09-23 19:22:59.798323 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.798352 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.798363 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.798369 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.798375 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.798381 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.798387 | orchestrator | 2025-09-23 19:22:59.798394 | orchestrator | TASK 
[ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2025-09-23 19:22:59.798400 | orchestrator | Tuesday 23 September 2025 19:15:06 +0000 (0:00:00.794) 0:02:52.268 ***** 2025-09-23 19:22:59.798406 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.798412 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.798418 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.798424 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.798430 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.798436 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.798442 | orchestrator | 2025-09-23 19:22:59.798449 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2025-09-23 19:22:59.798455 | orchestrator | Tuesday 23 September 2025 19:15:07 +0000 (0:00:00.903) 0:02:53.172 ***** 2025-09-23 19:22:59.798461 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.798467 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.798473 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.798479 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.798485 | orchestrator | 2025-09-23 19:22:59.798491 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2025-09-23 19:22:59.798497 | orchestrator | Tuesday 23 September 2025 19:15:08 +0000 (0:00:00.540) 0:02:53.713 ***** 2025-09-23 19:22:59.798503 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.798509 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.798515 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.798521 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.798535 | orchestrator | 2025-09-23 19:22:59.798546 | orchestrator | TASK 
[ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2025-09-23 19:22:59.798557 | orchestrator | Tuesday 23 September 2025 19:15:08 +0000 (0:00:00.567) 0:02:54.280 ***** 2025-09-23 19:22:59.798567 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.798577 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.798588 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.798598 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.798610 | orchestrator | 2025-09-23 19:22:59.798620 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2025-09-23 19:22:59.798629 | orchestrator | Tuesday 23 September 2025 19:15:09 +0000 (0:00:00.767) 0:02:55.048 ***** 2025-09-23 19:22:59.798635 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.798642 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.798648 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.798654 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.798660 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.798666 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.798672 | orchestrator | 2025-09-23 19:22:59.798678 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2025-09-23 19:22:59.798685 | orchestrator | Tuesday 23 September 2025 19:15:09 +0000 (0:00:00.536) 0:02:55.584 ***** 2025-09-23 19:22:59.798691 | orchestrator | ok: [testbed-node-3] => (item=0) 2025-09-23 19:22:59.798697 | orchestrator | ok: [testbed-node-4] => (item=0) 2025-09-23 19:22:59.798703 | orchestrator | ok: [testbed-node-5] => (item=0) 2025-09-23 19:22:59.798709 | orchestrator | skipping: [testbed-node-1] => (item=0)  2025-09-23 19:22:59.798715 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.798721 | orchestrator | skipping: [testbed-node-0] => 
(item=0)  2025-09-23 19:22:59.798727 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.798733 | orchestrator | skipping: [testbed-node-2] => (item=0)  2025-09-23 19:22:59.798739 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.798745 | orchestrator | 2025-09-23 19:22:59.798751 | orchestrator | TASK [ceph-config : Generate Ceph file] **************************************** 2025-09-23 19:22:59.798758 | orchestrator | Tuesday 23 September 2025 19:15:11 +0000 (0:00:01.682) 0:02:57.267 ***** 2025-09-23 19:22:59.798764 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.798770 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.798776 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.798782 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.798788 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.798794 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.798800 | orchestrator | 2025-09-23 19:22:59.798806 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-23 19:22:59.798812 | orchestrator | Tuesday 23 September 2025 19:15:14 +0000 (0:00:02.661) 0:02:59.928 ***** 2025-09-23 19:22:59.798819 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.798825 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.798831 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.798837 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.798843 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.798849 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.798855 | orchestrator | 2025-09-23 19:22:59.798861 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] ********************************** 2025-09-23 19:22:59.798867 | orchestrator | Tuesday 23 September 2025 19:15:15 +0000 (0:00:01.528) 0:03:01.457 ***** 2025-09-23 19:22:59.798873 | orchestrator | skipping: 
[testbed-node-3] 2025-09-23 19:22:59.798879 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.798885 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.798891 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.798898 | orchestrator | 2025-09-23 19:22:59.798916 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ******** 2025-09-23 19:22:59.798927 | orchestrator | Tuesday 23 September 2025 19:15:16 +0000 (0:00:01.089) 0:03:02.547 ***** 2025-09-23 19:22:59.798936 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.798947 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.798956 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.798966 | orchestrator | 2025-09-23 19:22:59.799010 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mon restart script] *********************** 2025-09-23 19:22:59.799023 | orchestrator | Tuesday 23 September 2025 19:15:17 +0000 (0:00:00.325) 0:03:02.872 ***** 2025-09-23 19:22:59.799034 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.799042 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.799049 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.799060 | orchestrator | 2025-09-23 19:22:59.799070 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ******************** 2025-09-23 19:22:59.799124 | orchestrator | Tuesday 23 September 2025 19:15:18 +0000 (0:00:01.273) 0:03:04.145 ***** 2025-09-23 19:22:59.799141 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-23 19:22:59.799151 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-23 19:22:59.799160 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-23 19:22:59.799165 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.799171 | orchestrator | 2025-09-23 
19:22:59.799176 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] ********* 2025-09-23 19:22:59.799183 | orchestrator | Tuesday 23 September 2025 19:15:19 +0000 (0:00:00.882) 0:03:05.027 ***** 2025-09-23 19:22:59.799192 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.799201 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.799210 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.799218 | orchestrator | 2025-09-23 19:22:59.799227 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] ********************************** 2025-09-23 19:22:59.799237 | orchestrator | Tuesday 23 September 2025 19:15:19 +0000 (0:00:00.283) 0:03:05.311 ***** 2025-09-23 19:22:59.799246 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.799255 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.799262 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.799268 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.799273 | orchestrator | 2025-09-23 19:22:59.799278 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] ********************** 2025-09-23 19:22:59.799285 | orchestrator | Tuesday 23 September 2025 19:15:20 +0000 (0:00:00.880) 0:03:06.191 ***** 2025-09-23 19:22:59.799295 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.799304 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.799312 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.799321 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799330 | orchestrator | 2025-09-23 19:22:59.799339 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ******** 2025-09-23 19:22:59.799348 | orchestrator | Tuesday 23 September 2025 19:15:20 +0000 (0:00:00.371) 
0:03:06.563 ***** 2025-09-23 19:22:59.799357 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799366 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.799376 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.799383 | orchestrator | 2025-09-23 19:22:59.799393 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] ******************************* 2025-09-23 19:22:59.799401 | orchestrator | Tuesday 23 September 2025 19:15:21 +0000 (0:00:00.405) 0:03:06.969 ***** 2025-09-23 19:22:59.799410 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799419 | orchestrator | 2025-09-23 19:22:59.799428 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] *********************** 2025-09-23 19:22:59.799437 | orchestrator | Tuesday 23 September 2025 19:15:21 +0000 (0:00:00.206) 0:03:07.176 ***** 2025-09-23 19:22:59.799455 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799464 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.799472 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.799481 | orchestrator | 2025-09-23 19:22:59.799490 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] ********************************* 2025-09-23 19:22:59.799499 | orchestrator | Tuesday 23 September 2025 19:15:21 +0000 (0:00:00.281) 0:03:07.457 ***** 2025-09-23 19:22:59.799506 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799518 | orchestrator | 2025-09-23 19:22:59.799529 | orchestrator | RUNNING HANDLER [ceph-handler : Get balancer module status] ******************** 2025-09-23 19:22:59.799537 | orchestrator | Tuesday 23 September 2025 19:15:22 +0000 (0:00:00.198) 0:03:07.655 ***** 2025-09-23 19:22:59.799546 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799555 | orchestrator | 2025-09-23 19:22:59.799564 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] ************** 2025-09-23 19:22:59.799573 | 
orchestrator | Tuesday 23 September 2025 19:15:22 +0000 (0:00:00.235) 0:03:07.891 ***** 2025-09-23 19:22:59.799582 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799595 | orchestrator | 2025-09-23 19:22:59.799605 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ****************************** 2025-09-23 19:22:59.799614 | orchestrator | Tuesday 23 September 2025 19:15:22 +0000 (0:00:00.110) 0:03:08.001 ***** 2025-09-23 19:22:59.799623 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799632 | orchestrator | 2025-09-23 19:22:59.799640 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] ***************** 2025-09-23 19:22:59.799649 | orchestrator | Tuesday 23 September 2025 19:15:22 +0000 (0:00:00.189) 0:03:08.190 ***** 2025-09-23 19:22:59.799658 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799667 | orchestrator | 2025-09-23 19:22:59.799675 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] ******************* 2025-09-23 19:22:59.799684 | orchestrator | Tuesday 23 September 2025 19:15:22 +0000 (0:00:00.195) 0:03:08.386 ***** 2025-09-23 19:22:59.799694 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.799703 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.799712 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.799721 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799730 | orchestrator | 2025-09-23 19:22:59.799738 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] ********* 2025-09-23 19:22:59.799747 | orchestrator | Tuesday 23 September 2025 19:15:23 +0000 (0:00:00.514) 0:03:08.901 ***** 2025-09-23 19:22:59.799756 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799795 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.799810 | orchestrator | 
skipping: [testbed-node-5] 2025-09-23 19:22:59.799819 | orchestrator | 2025-09-23 19:22:59.799828 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] *************** 2025-09-23 19:22:59.799837 | orchestrator | Tuesday 23 September 2025 19:15:23 +0000 (0:00:00.458) 0:03:09.359 ***** 2025-09-23 19:22:59.799846 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799855 | orchestrator | 2025-09-23 19:22:59.799864 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] **************************** 2025-09-23 19:22:59.799873 | orchestrator | Tuesday 23 September 2025 19:15:23 +0000 (0:00:00.197) 0:03:09.556 ***** 2025-09-23 19:22:59.799882 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.799891 | orchestrator | 2025-09-23 19:22:59.799900 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2025-09-23 19:22:59.799909 | orchestrator | Tuesday 23 September 2025 19:15:24 +0000 (0:00:00.191) 0:03:09.747 ***** 2025-09-23 19:22:59.799918 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.799926 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.799935 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.799944 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.799954 | orchestrator | 2025-09-23 19:22:59.799970 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called before restart] ******** 2025-09-23 19:22:59.799978 | orchestrator | Tuesday 23 September 2025 19:15:25 +0000 (0:00:00.917) 0:03:10.665 ***** 2025-09-23 19:22:59.799987 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.799996 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.800005 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.800014 | orchestrator | 2025-09-23 19:22:59.800023 | orchestrator | RUNNING HANDLER [ceph-handler : 
Copy mds restart script] *********************** 2025-09-23 19:22:59.800032 | orchestrator | Tuesday 23 September 2025 19:15:25 +0000 (0:00:00.302) 0:03:10.967 ***** 2025-09-23 19:22:59.800041 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.800048 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.800054 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.800059 | orchestrator | 2025-09-23 19:22:59.800064 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2025-09-23 19:22:59.800070 | orchestrator | Tuesday 23 September 2025 19:15:26 +0000 (0:00:01.078) 0:03:12.046 ***** 2025-09-23 19:22:59.800075 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.800093 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.800099 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.800104 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.800109 | orchestrator | 2025-09-23 19:22:59.800115 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2025-09-23 19:22:59.800120 | orchestrator | Tuesday 23 September 2025 19:15:27 +0000 (0:00:00.697) 0:03:12.743 ***** 2025-09-23 19:22:59.800125 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.800131 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.800136 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.800141 | orchestrator | 2025-09-23 19:22:59.800147 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2025-09-23 19:22:59.800152 | orchestrator | Tuesday 23 September 2025 19:15:27 +0000 (0:00:00.327) 0:03:13.071 ***** 2025-09-23 19:22:59.800158 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.800163 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.800168 | orchestrator | skipping: 
[testbed-node-2] 2025-09-23 19:22:59.800174 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.800179 | orchestrator | 2025-09-23 19:22:59.800184 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2025-09-23 19:22:59.800190 | orchestrator | Tuesday 23 September 2025 19:15:28 +0000 (0:00:00.865) 0:03:13.937 ***** 2025-09-23 19:22:59.800195 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.800200 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.800206 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.800211 | orchestrator | 2025-09-23 19:22:59.800216 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2025-09-23 19:22:59.800222 | orchestrator | Tuesday 23 September 2025 19:15:28 +0000 (0:00:00.361) 0:03:14.298 ***** 2025-09-23 19:22:59.800227 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.800232 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.800238 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.800243 | orchestrator | 2025-09-23 19:22:59.800248 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ******************** 2025-09-23 19:22:59.800254 | orchestrator | Tuesday 23 September 2025 19:15:30 +0000 (0:00:01.712) 0:03:16.011 ***** 2025-09-23 19:22:59.800259 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.800264 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.800270 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.800275 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.800280 | orchestrator | 2025-09-23 19:22:59.800286 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2025-09-23 19:22:59.800296 | 
orchestrator | Tuesday 23 September 2025 19:15:30 +0000 (0:00:00.600) 0:03:16.612 ***** 2025-09-23 19:22:59.800301 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.800306 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.800312 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.800317 | orchestrator | 2025-09-23 19:22:59.800322 | orchestrator | RUNNING HANDLER [ceph-handler : Rbdmirrors handler] **************************** 2025-09-23 19:22:59.800328 | orchestrator | Tuesday 23 September 2025 19:15:31 +0000 (0:00:00.330) 0:03:16.942 ***** 2025-09-23 19:22:59.800334 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.800339 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.800344 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.800350 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.800355 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.800360 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.800366 | orchestrator | 2025-09-23 19:22:59.800371 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2025-09-23 19:22:59.800399 | orchestrator | Tuesday 23 September 2025 19:15:31 +0000 (0:00:00.605) 0:03:17.548 ***** 2025-09-23 19:22:59.800406 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.800411 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.800416 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.800422 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.800427 | orchestrator | 2025-09-23 19:22:59.800432 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2025-09-23 19:22:59.800438 | orchestrator | Tuesday 23 September 2025 19:15:33 +0000 (0:00:01.114) 0:03:18.662 ***** 2025-09-23 19:22:59.800443 | orchestrator | ok: 
[testbed-node-0] 2025-09-23 19:22:59.800449 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.800454 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.800459 | orchestrator | 2025-09-23 19:22:59.800465 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2025-09-23 19:22:59.800470 | orchestrator | Tuesday 23 September 2025 19:15:33 +0000 (0:00:00.376) 0:03:19.039 ***** 2025-09-23 19:22:59.800475 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.800481 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.800486 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.800491 | orchestrator | 2025-09-23 19:22:59.800497 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2025-09-23 19:22:59.800502 | orchestrator | Tuesday 23 September 2025 19:15:35 +0000 (0:00:01.661) 0:03:20.700 ***** 2025-09-23 19:22:59.800507 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-23 19:22:59.800513 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-23 19:22:59.800518 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-23 19:22:59.800523 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.800529 | orchestrator | 2025-09-23 19:22:59.800534 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2025-09-23 19:22:59.800539 | orchestrator | Tuesday 23 September 2025 19:15:35 +0000 (0:00:00.723) 0:03:21.424 ***** 2025-09-23 19:22:59.800545 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.800550 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.800555 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.800561 | orchestrator | 2025-09-23 19:22:59.800566 | orchestrator | PLAY [Apply role ceph-mon] ***************************************************** 2025-09-23 19:22:59.800571 | orchestrator | 2025-09-23 
19:22:59.800577 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
Tuesday 23 September 2025 19:15:36 +0000 (0:00:00.678) 0:03:22.102 *****
included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-handler : Include check_running_containers.yml] *********************
Tuesday 23 September 2025 19:15:37 +0000 (0:00:00.810) 0:03:22.912 *****
included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-handler : Check for a mon container] ********************************
Tuesday 23 September 2025 19:15:37 +0000 (0:00:00.573) 0:03:23.486 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Check for an osd container] *******************************
Tuesday 23 September 2025 19:15:38 +0000 (0:00:00.752) 0:03:24.238 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a mds container] ********************************
Tuesday 23 September 2025 19:15:39 +0000 (0:00:00.731) 0:03:24.969 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a rgw container] ********************************
Tuesday 23 September 2025 19:15:39 +0000 (0:00:00.377) 0:03:25.347 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a mgr container] ********************************
Tuesday 23 September 2025 19:15:40 +0000 (0:00:00.319) 0:03:25.667 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Check for a rbd mirror container] *************************
Tuesday 23 September 2025 19:15:40 +0000 (0:00:00.831) 0:03:26.499 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a nfs container] ********************************
Tuesday 23 September 2025 19:15:41 +0000 (0:00:00.362) 0:03:26.861 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a ceph-crash container] *************************
Tuesday 23 September 2025 19:15:41 +0000 (0:00:00.606) 0:03:27.467 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Check for a ceph-exporter container] **********************
Tuesday 23 September 2025 19:15:42 +0000 (0:00:00.712) 0:03:28.180 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Include check_socket_non_container.yml] *******************
Tuesday 23 September 2025 19:15:43 +0000 (0:00:00.737) 0:03:28.918 *****
skipping: [testbed-node-1]
skipping: [testbed-node-0]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_mon_status] ******************************
Tuesday 23 September 2025 19:15:43 +0000 (0:00:00.337) 0:03:29.258 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_osd_status] ******************************
Tuesday 23 September 2025 19:15:44 +0000 (0:00:00.472) 0:03:29.730 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_mds_status] ******************************
Tuesday 23 September 2025 19:15:44 +0000 (0:00:00.265) 0:03:29.996 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
Tuesday 23 September 2025 19:15:44 +0000 (0:00:00.306) 0:03:30.261 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
Tuesday 23 September 2025 19:15:44 +0000 (0:00:00.306) 0:03:30.567 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_rbd_status] ******************************
Tuesday 23 September 2025 19:15:45 +0000 (0:00:00.532) 0:03:31.100 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_mgr_status] ******************************
Tuesday 23 September 2025 19:15:45 +0000 (0:00:00.287) 0:03:31.388 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_crash_status] ****************************
Tuesday 23 September 2025 19:15:46 +0000 (0:00:00.393) 0:03:31.782 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_exporter_status] *************************
Tuesday 23 September 2025 19:15:46 +0000 (0:00:00.437) 0:03:32.219 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-mon : Set_fact container_exec_cmd] **********************************
Tuesday 23 September 2025 19:15:48 +0000 (0:00:01.403) 0:03:33.623 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-mon : Include deploy_monitors.yml] **********************************
Tuesday 23 September 2025 19:15:48 +0000 (0:00:00.398) 0:03:34.021 *****
included: /ansible/roles/ceph-mon/tasks/deploy_monitors.yml for testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-mon : Check if monitor initial keyring already exists] **************
Tuesday 23 September 2025 19:15:49 +0000 (0:00:00.703) 0:03:34.724 *****
skipping: [testbed-node-0]

TASK [ceph-mon : Generate monitor initial keyring] *****************************
Tuesday 23 September 2025 19:15:49 +0000 (0:00:00.272) 0:03:34.997 *****
changed: [testbed-node-0 -> localhost]

TASK [ceph-mon : Set_fact _initial_mon_key_success] ****************************
Tuesday 23 September 2025 19:15:50 +0000 (0:00:00.907) 0:03:35.904 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-mon : Get initial keyring when it already exists] *******************
Tuesday 23 September 2025 19:15:50 +0000 (0:00:00.385) 0:03:36.290 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-mon : Create monitor initial keyring] *******************************
Tuesday 23 September 2025 19:15:50 +0000 (0:00:00.309) 0:03:36.599 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Copy the initial key in /etc/ceph (for containers)] ***********
Tuesday 23 September 2025 19:15:52 +0000 (0:00:01.187) 0:03:37.786 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Create monitor directory] *************************************
Tuesday 23 September 2025 19:15:53 +0000 (0:00:00.959) 0:03:38.745 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Recursively fix ownership of monitor directory] ***************
Tuesday 23 September 2025 19:15:53 +0000 (0:00:00.739) 0:03:39.485 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-mon : Create admin keyring] *****************************************
Tuesday 23 September 2025 19:15:54 +0000 (0:00:00.621) 0:03:40.106 *****
changed: [testbed-node-0]

TASK [ceph-mon : Slurp admin keyring] ******************************************
Tuesday 23 September 2025 19:15:55 +0000 (0:00:01.091) 0:03:41.197 *****
ok: [testbed-node-0]

TASK [ceph-mon : Copy admin keyring over to mons] ******************************
Tuesday 23 September 2025 19:15:56 +0000 (0:00:00.641) 0:03:41.839 *****
changed: [testbed-node-0] => (item=None)
ok: [testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None)
ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=None)
changed: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=None)
ok: [testbed-node-1] => (item=None)
ok: [testbed-node-2 -> testbed-node-1(192.168.16.11)] => (item=None)
changed: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=None)
changed: [testbed-node-0 -> {{ item }}]
ok: [testbed-node-2] => (item=None)
ok: [testbed-node-2 -> {{ item }}]
ok: [testbed-node-1 -> testbed-node-2(192.168.16.12)] => (item=None)
ok: [testbed-node-1 -> {{ item }}]

TASK [ceph-mon : Import admin keyring into mon keyring] ************************
Tuesday 23 September 2025 19:15:59 +0000 (0:00:03.660) 0:03:45.499 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Set_fact ceph-mon container command] **************************
Tuesday 23 September 2025 19:16:01 +0000 (0:00:01.495) 0:03:46.995 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-mon : Set_fact monmaptool container command] ************************
Tuesday 23 September 2025 19:16:01 +0000 (0:00:00.304) 0:03:47.300 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-mon : Generate initial monmap] **************************************
Tuesday 23 September 2025 19:16:02 +0000 (0:00:00.343) 0:03:47.644 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Ceph monitor mkfs with keyring] *******************************
Tuesday 23 September 2025 19:16:03 +0000 (0:00:01.594) 0:03:49.239 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Ceph monitor mkfs without keyring] ****************************
Tuesday 23 September 2025 19:16:05 +0000 (0:00:01.398) 0:03:50.638 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-mon : Include start_monitor.yml] ************************************
Tuesday 23 September 2025 19:16:05 +0000 (0:00:00.361) 0:03:50.999 *****
included: /ansible/roles/ceph-mon/tasks/start_monitor.yml for testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-mon : Ensure systemd service override directory exists] *************
Tuesday 23 September 2025 19:16:05 +0000 (0:00:00.501) 0:03:51.501 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-mon : Add ceph-mon systemd service overrides] ***********************
Tuesday 23 September 2025 19:16:06 +0000 (0:00:00.532) 0:03:52.034 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-mon : Include_tasks systemd.yml] ************************************
Tuesday 23 September 2025 19:16:06 +0000 (0:00:00.299) 0:03:52.333 *****
included: /ansible/roles/ceph-mon/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-mon : Generate systemd unit file for mon container] *****************
Tuesday 23 September 2025 19:16:07 +0000 (0:00:00.533) 0:03:52.867 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Generate systemd ceph-mon target file] ************************
Tuesday 23 September 2025 19:16:09 +0000 (0:00:02.472) 0:03:55.340 *****
changed: [testbed-node-1]
changed: [testbed-node-2]
changed: [testbed-node-0]

TASK [ceph-mon : Enable ceph-mon.target] ***************************************
Tuesday 23 September 2025 19:16:10 +0000 (0:00:01.212) 0:03:56.552 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Start the monitor service] ************************************
Tuesday 23 September 2025 19:16:12 +0000 (0:00:01.777) 0:03:58.329 *****
changed: [testbed-node-0]
changed: [testbed-node-1]
changed: [testbed-node-2]

TASK [ceph-mon : Include_tasks ceph_keys.yml] **********************************
Tuesday 23 September 2025 19:16:14 +0000 (0:00:01.901) 0:04:00.231 *****
included: /ansible/roles/ceph-mon/tasks/ceph_keys.yml for testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-mon : Waiting for the monitor(s) to form the quorum...] *************
Tuesday 23 September 2025 19:16:15 +0000 (0:00:00.788) 0:04:01.020 *****
FAILED - RETRYING: [testbed-node-0]: Waiting for the monitor(s) to form the quorum... (10 retries left).
ok: [testbed-node-0]

TASK [ceph-mon : Fetch ceph initial keys] **************************************
Tuesday 23 September 2025 19:16:37 +0000 (0:00:22.009) 0:04:23.029 *****
ok: [testbed-node-0]
ok: [testbed-node-2]
ok: [testbed-node-1]

TASK [ceph-mon : Include secure_cluster.yml] ***********************************
Tuesday 23 September 2025 19:16:46 +0000 (0:00:09.071) 0:04:32.101 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-mon : Set cluster configs] ******************************************
Tuesday 23 September 2025 19:16:46 +0000 (0:00:00.286) 0:04:32.388 *****
changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ec33e2fb03c9128625ba3cc6c87dfdf350949bcc'}}, {'key': 'public_network', 'value': '192.168.16.0/20'}])
changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ec33e2fb03c9128625ba3cc6c87dfdf350949bcc'}}, {'key': 'cluster_network', 'value': '192.168.16.0/20'}])
changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ec33e2fb03c9128625ba3cc6c87dfdf350949bcc'}}, {'key': 'osd_pool_default_crush_rule', 'value': -1}])
changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ec33e2fb03c9128625ba3cc6c87dfdf350949bcc'}}, {'key': 'ms_bind_ipv6', 'value': 'False'}])
changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ec33e2fb03c9128625ba3cc6c87dfdf350949bcc'}}, {'key': 'ms_bind_ipv4', 'value': 'True'}])
skipping: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__ec33e2fb03c9128625ba3cc6c87dfdf350949bcc'}}, {'key': 'osd_crush_chooseleaf_type', 'value': '__omit_place_holder__ec33e2fb03c9128625ba3cc6c87dfdf350949bcc'}])

RUNNING HANDLER [ceph-handler : Make tempdir for scripts] **********************
Tuesday 23 September 2025 19:17:01 +0000 (0:00:14.898) 0:04:47.287 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

RUNNING HANDLER [ceph-handler : Mons handler] **********************************
Tuesday 23 September 2025 19:17:02 +0000 (0:00:00.362) 0:04:47.649 *****
included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2

RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ********
Tuesday 23 September 2025 19:17:02 +0000 (0:00:00.747) 0:04:48.397 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

RUNNING HANDLER [ceph-handler : Copy mon restart script] ***********************
Tuesday 23 September 2025 19:17:03 +0000 (0:00:00.392) 0:04:48.789 *****
skipping: [testbed-node-1]
skipping: [testbed-node-2]
skipping: [testbed-node-0]

RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ********************
Tuesday 23 September 2025 19:17:03 +0000 (0:00:00.340) 0:04:49.130 *****
skipping: [testbed-node-0] => (item=testbed-node-0)
skipping: [testbed-node-0] => (item=testbed-node-1)
skipping: [testbed-node-0] => (item=testbed-node-2)
skipping: [testbed-node-0]

RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] *********
Tuesday 23 September 2025 19:17:04 +0000 (0:00:00.624) 0:04:49.755 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

PLAY [Apply role ceph-mgr] *****************************************************

TASK [ceph-handler : Include check_running_cluster.yml] ************************
Tuesday 23 September 2025 19:17:04 +0000 (0:00:00.773) 0:04:50.529 *****
included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-handler : Include check_running_containers.yml] *********************
Tuesday 23 September 2025 19:17:05 +0000 (0:00:00.524) 0:04:51.053 *****
included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2

TASK [ceph-handler : Check for a mon container] ********************************
Tuesday 23 September 2025 19:17:06 +0000 (0:00:00.739) 0:04:51.793 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Check for an osd container] *******************************
Tuesday 23 September 2025 19:17:06 +0000 (0:00:00.703) 0:04:52.496 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a mds container] ********************************
Tuesday 23 September 2025 19:17:07 +0000 (0:00:00.335) 0:04:52.832 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a rgw container] ********************************
Tuesday 23 September 2025 19:17:07 +0000 (0:00:00.296) 0:04:53.129 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a mgr container] ********************************
Tuesday 23 September 2025 19:17:07 +0000 (0:00:00.342) 0:04:53.472 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Check for a rbd mirror container] *************************
Tuesday 23 September 2025 19:17:08 +0000 (0:00:01.019) 0:04:54.491 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a nfs container] ********************************
Tuesday 23 September 2025 19:17:09 +0000 (0:00:00.337) 0:04:54.828 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Check for a ceph-crash container] *************************
Tuesday 23 September 2025 19:17:09 +0000 (0:00:00.364) 0:04:55.193 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Check for a ceph-exporter container] **********************
Tuesday 23 September 2025 19:17:10 +0000 (0:00:00.827) 0:04:56.020 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Include check_socket_non_container.yml] *******************
Tuesday 23 September 2025 19:17:11 +0000 (0:00:00.989) 0:04:57.010 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_mon_status] ******************************
Tuesday 23 September 2025 19:17:11 +0000 (0:00:00.313) 0:04:57.324 *****
ok: [testbed-node-0]
ok: [testbed-node-1]
ok: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_osd_status] ******************************
Tuesday 23 September 2025 19:17:12 +0000 (0:00:00.334) 0:04:57.659 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_mds_status] ******************************
Tuesday 23 September 2025 19:17:12 +0000 (0:00:00.297) 0:04:57.957 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
Tuesday 23 September 2025 19:17:12 +0000 (0:00:00.522) 0:04:58.479 *****
skipping: [testbed-node-0]
skipping: [testbed-node-1]
skipping: [testbed-node-2]

TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
Tuesday 23 September 2025 19:17:13 +0000 (0:00:00.296) 0:04:58.776 *****
orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.803387 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.803397 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.803403 | orchestrator | 2025-09-23 19:22:59.803412 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-23 19:22:59.803421 | orchestrator | Tuesday 23 September 2025 19:17:13 +0000 (0:00:00.307) 0:04:59.084 ***** 2025-09-23 19:22:59.803430 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.803443 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.803456 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.803469 | orchestrator | 2025-09-23 19:22:59.803478 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-23 19:22:59.803487 | orchestrator | Tuesday 23 September 2025 19:17:13 +0000 (0:00:00.325) 0:04:59.409 ***** 2025-09-23 19:22:59.803496 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.803504 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.803512 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.803520 | orchestrator | 2025-09-23 19:22:59.803537 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-23 19:22:59.803550 | orchestrator | Tuesday 23 September 2025 19:17:14 +0000 (0:00:00.313) 0:04:59.723 ***** 2025-09-23 19:22:59.803561 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.803568 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.803576 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.803584 | orchestrator | 2025-09-23 19:22:59.803592 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-23 19:22:59.803600 | orchestrator | Tuesday 23 September 2025 19:17:14 +0000 (0:00:00.594) 0:05:00.317 ***** 2025-09-23 19:22:59.803608 | orchestrator | ok: [testbed-node-0] 
2025-09-23 19:22:59.803616 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.803628 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.803638 | orchestrator | 2025-09-23 19:22:59.803650 | orchestrator | TASK [ceph-mgr : Set_fact container_exec_cmd] ********************************** 2025-09-23 19:22:59.803659 | orchestrator | Tuesday 23 September 2025 19:17:15 +0000 (0:00:00.544) 0:05:00.862 ***** 2025-09-23 19:22:59.803667 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2025-09-23 19:22:59.803674 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-23 19:22:59.803682 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-23 19:22:59.803690 | orchestrator | 2025-09-23 19:22:59.803698 | orchestrator | TASK [ceph-mgr : Include common.yml] ******************************************* 2025-09-23 19:22:59.803706 | orchestrator | Tuesday 23 September 2025 19:17:16 +0000 (0:00:00.866) 0:05:01.729 ***** 2025-09-23 19:22:59.803716 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/common.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.803727 | orchestrator | 2025-09-23 19:22:59.803739 | orchestrator | TASK [ceph-mgr : Create mgr directory] ***************************************** 2025-09-23 19:22:59.803749 | orchestrator | Tuesday 23 September 2025 19:17:16 +0000 (0:00:00.745) 0:05:02.474 ***** 2025-09-23 19:22:59.803757 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.803765 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.803773 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.803780 | orchestrator | 2025-09-23 19:22:59.803788 | orchestrator | TASK [ceph-mgr : Fetch ceph mgr keyring] *************************************** 2025-09-23 19:22:59.803796 | orchestrator | Tuesday 23 September 2025 19:17:17 +0000 (0:00:00.768) 0:05:03.243 ***** 2025-09-23 19:22:59.803804 | 
orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.803818 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.803829 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.803838 | orchestrator | 2025-09-23 19:22:59.803846 | orchestrator | TASK [ceph-mgr : Create ceph mgr keyring(s) on a mon node] ********************* 2025-09-23 19:22:59.803854 | orchestrator | Tuesday 23 September 2025 19:17:17 +0000 (0:00:00.298) 0:05:03.542 ***** 2025-09-23 19:22:59.803861 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-23 19:22:59.803870 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-23 19:22:59.803878 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-23 19:22:59.803886 | orchestrator | changed: [testbed-node-0 -> {{ groups[mon_group_name][0] }}] 2025-09-23 19:22:59.803893 | orchestrator | 2025-09-23 19:22:59.803904 | orchestrator | TASK [ceph-mgr : Set_fact _mgr_keys] ******************************************* 2025-09-23 19:22:59.803913 | orchestrator | Tuesday 23 September 2025 19:17:28 +0000 (0:00:10.746) 0:05:14.288 ***** 2025-09-23 19:22:59.803921 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.803932 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.803942 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.803954 | orchestrator | 2025-09-23 19:22:59.803962 | orchestrator | TASK [ceph-mgr : Get keys from monitors] *************************************** 2025-09-23 19:22:59.803970 | orchestrator | Tuesday 23 September 2025 19:17:29 +0000 (0:00:00.605) 0:05:14.894 ***** 2025-09-23 19:22:59.803977 | orchestrator | skipping: [testbed-node-0] => (item=None)  2025-09-23 19:22:59.803984 | orchestrator | skipping: [testbed-node-1] => (item=None)  2025-09-23 19:22:59.804002 | orchestrator | skipping: [testbed-node-2] => (item=None)  2025-09-23 19:22:59.804009 | orchestrator | ok: [testbed-node-0] => (item=None) 2025-09-23 19:22:59.804017 | orchestrator | ok: 
[testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.804025 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.804038 | orchestrator | 2025-09-23 19:22:59.804074 | orchestrator | TASK [ceph-mgr : Copy ceph key(s) if needed] *********************************** 2025-09-23 19:22:59.804103 | orchestrator | Tuesday 23 September 2025 19:17:31 +0000 (0:00:02.084) 0:05:16.979 ***** 2025-09-23 19:22:59.804114 | orchestrator | skipping: [testbed-node-0] => (item=None)  2025-09-23 19:22:59.804125 | orchestrator | skipping: [testbed-node-1] => (item=None)  2025-09-23 19:22:59.804134 | orchestrator | skipping: [testbed-node-2] => (item=None)  2025-09-23 19:22:59.804141 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-23 19:22:59.804149 | orchestrator | changed: [testbed-node-1] => (item=None) 2025-09-23 19:22:59.804157 | orchestrator | changed: [testbed-node-2] => (item=None) 2025-09-23 19:22:59.804165 | orchestrator | 2025-09-23 19:22:59.804173 | orchestrator | TASK [ceph-mgr : Set mgr key permissions] ************************************** 2025-09-23 19:22:59.804181 | orchestrator | Tuesday 23 September 2025 19:17:32 +0000 (0:00:01.171) 0:05:18.150 ***** 2025-09-23 19:22:59.804188 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.804200 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.804212 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.804222 | orchestrator | 2025-09-23 19:22:59.804230 | orchestrator | TASK [ceph-mgr : Append dashboard modules to ceph_mgr_modules] ***************** 2025-09-23 19:22:59.804238 | orchestrator | Tuesday 23 September 2025 19:17:33 +0000 (0:00:00.711) 0:05:18.861 ***** 2025-09-23 19:22:59.804246 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.804254 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.804262 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.804270 | 
orchestrator | 2025-09-23 19:22:59.804278 | orchestrator | TASK [ceph-mgr : Include pre_requisite.yml] ************************************ 2025-09-23 19:22:59.804286 | orchestrator | Tuesday 23 September 2025 19:17:33 +0000 (0:00:00.540) 0:05:19.402 ***** 2025-09-23 19:22:59.804294 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.804301 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.804309 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.804319 | orchestrator | 2025-09-23 19:22:59.804330 | orchestrator | TASK [ceph-mgr : Include start_mgr.yml] **************************************** 2025-09-23 19:22:59.804338 | orchestrator | Tuesday 23 September 2025 19:17:34 +0000 (0:00:00.322) 0:05:19.725 ***** 2025-09-23 19:22:59.804346 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/start_mgr.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.804354 | orchestrator | 2025-09-23 19:22:59.804361 | orchestrator | TASK [ceph-mgr : Ensure systemd service override directory exists] ************* 2025-09-23 19:22:59.804369 | orchestrator | Tuesday 23 September 2025 19:17:34 +0000 (0:00:00.523) 0:05:20.248 ***** 2025-09-23 19:22:59.804377 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.804385 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.804392 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.804401 | orchestrator | 2025-09-23 19:22:59.804408 | orchestrator | TASK [ceph-mgr : Add ceph-mgr systemd service overrides] *********************** 2025-09-23 19:22:59.804416 | orchestrator | Tuesday 23 September 2025 19:17:34 +0000 (0:00:00.306) 0:05:20.554 ***** 2025-09-23 19:22:59.804424 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.804432 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.804440 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.804448 | orchestrator | 2025-09-23 19:22:59.804456 | orchestrator | TASK 
[ceph-mgr : Include_tasks systemd.yml] ************************************ 2025-09-23 19:22:59.804463 | orchestrator | Tuesday 23 September 2025 19:17:35 +0000 (0:00:00.636) 0:05:21.191 ***** 2025-09-23 19:22:59.804472 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.804486 | orchestrator | 2025-09-23 19:22:59.804494 | orchestrator | TASK [ceph-mgr : Generate systemd unit file] *********************************** 2025-09-23 19:22:59.804501 | orchestrator | Tuesday 23 September 2025 19:17:36 +0000 (0:00:00.528) 0:05:21.720 ***** 2025-09-23 19:22:59.804509 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.804517 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.804525 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.804533 | orchestrator | 2025-09-23 19:22:59.804541 | orchestrator | TASK [ceph-mgr : Generate systemd ceph-mgr target file] ************************ 2025-09-23 19:22:59.804549 | orchestrator | Tuesday 23 September 2025 19:17:37 +0000 (0:00:01.160) 0:05:22.880 ***** 2025-09-23 19:22:59.804557 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.804565 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.804573 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.804580 | orchestrator | 2025-09-23 19:22:59.804588 | orchestrator | TASK [ceph-mgr : Enable ceph-mgr.target] *************************************** 2025-09-23 19:22:59.804597 | orchestrator | Tuesday 23 September 2025 19:17:38 +0000 (0:00:01.390) 0:05:24.270 ***** 2025-09-23 19:22:59.804605 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.804612 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.804620 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.804628 | orchestrator | 2025-09-23 19:22:59.804636 | orchestrator | TASK [ceph-mgr : Systemd start mgr] ******************************************** 
2025-09-23 19:22:59.804644 | orchestrator | Tuesday 23 September 2025 19:17:40 +0000 (0:00:01.725) 0:05:25.996 ***** 2025-09-23 19:22:59.804652 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.804659 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.804667 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.804675 | orchestrator | 2025-09-23 19:22:59.804683 | orchestrator | TASK [ceph-mgr : Include mgr_modules.yml] ************************************** 2025-09-23 19:22:59.804691 | orchestrator | Tuesday 23 September 2025 19:17:42 +0000 (0:00:02.022) 0:05:28.018 ***** 2025-09-23 19:22:59.804699 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.804707 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.804715 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/mgr_modules.yml for testbed-node-2 2025-09-23 19:22:59.804722 | orchestrator | 2025-09-23 19:22:59.804730 | orchestrator | TASK [ceph-mgr : Wait for all mgr to be up] ************************************ 2025-09-23 19:22:59.804738 | orchestrator | Tuesday 23 September 2025 19:17:42 +0000 (0:00:00.462) 0:05:28.480 ***** 2025-09-23 19:22:59.804746 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (30 retries left). 2025-09-23 19:22:59.804781 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (29 retries left). 2025-09-23 19:22:59.804790 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (28 retries left). 2025-09-23 19:22:59.804798 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (27 retries left). 2025-09-23 19:22:59.804806 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (26 retries left). 
2025-09-23 19:22:59.804814 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (25 retries left). 2025-09-23 19:22:59.804821 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] 2025-09-23 19:22:59.804829 | orchestrator | 2025-09-23 19:22:59.804837 | orchestrator | TASK [ceph-mgr : Get enabled modules from ceph-mgr] **************************** 2025-09-23 19:22:59.804845 | orchestrator | Tuesday 23 September 2025 19:18:19 +0000 (0:00:36.897) 0:06:05.378 ***** 2025-09-23 19:22:59.804853 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] 2025-09-23 19:22:59.804861 | orchestrator | 2025-09-23 19:22:59.804869 | orchestrator | TASK [ceph-mgr : Set _ceph_mgr_modules fact (convert _ceph_mgr_modules.stdout to a dict)] *** 2025-09-23 19:22:59.804877 | orchestrator | Tuesday 23 September 2025 19:18:21 +0000 (0:00:01.353) 0:06:06.731 ***** 2025-09-23 19:22:59.804891 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.804899 | orchestrator | 2025-09-23 19:22:59.804906 | orchestrator | TASK [ceph-mgr : Set _disabled_ceph_mgr_modules fact] ************************** 2025-09-23 19:22:59.804914 | orchestrator | Tuesday 23 September 2025 19:18:21 +0000 (0:00:00.324) 0:06:07.056 ***** 2025-09-23 19:22:59.804922 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.804931 | orchestrator | 2025-09-23 19:22:59.804939 | orchestrator | TASK [ceph-mgr : Disable ceph mgr enabled modules] ***************************** 2025-09-23 19:22:59.804946 | orchestrator | Tuesday 23 September 2025 19:18:21 +0000 (0:00:00.165) 0:06:07.221 ***** 2025-09-23 19:22:59.804954 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=iostat) 2025-09-23 19:22:59.804962 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=nfs) 2025-09-23 19:22:59.804970 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=restful) 2025-09-23 
19:22:59.804978 | orchestrator | 2025-09-23 19:22:59.804987 | orchestrator | TASK [ceph-mgr : Add modules to ceph-mgr] ************************************** 2025-09-23 19:22:59.804998 | orchestrator | Tuesday 23 September 2025 19:18:28 +0000 (0:00:06.451) 0:06:13.672 ***** 2025-09-23 19:22:59.805005 | orchestrator | skipping: [testbed-node-2] => (item=balancer)  2025-09-23 19:22:59.805012 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=dashboard) 2025-09-23 19:22:59.805019 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=prometheus) 2025-09-23 19:22:59.805026 | orchestrator | skipping: [testbed-node-2] => (item=status)  2025-09-23 19:22:59.805033 | orchestrator | 2025-09-23 19:22:59.805041 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-23 19:22:59.805048 | orchestrator | Tuesday 23 September 2025 19:18:32 +0000 (0:00:04.708) 0:06:18.380 ***** 2025-09-23 19:22:59.805055 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.805063 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.805070 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.805079 | orchestrator | 2025-09-23 19:22:59.805133 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2025-09-23 19:22:59.805142 | orchestrator | Tuesday 23 September 2025 19:18:33 +0000 (0:00:00.990) 0:06:19.371 ***** 2025-09-23 19:22:59.805150 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.805157 | orchestrator | 2025-09-23 19:22:59.805164 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2025-09-23 19:22:59.805171 | orchestrator | Tuesday 23 September 2025 19:18:34 +0000 (0:00:00.561) 0:06:19.932 ***** 2025-09-23 19:22:59.805178 | orchestrator | ok: [testbed-node-0] 
2025-09-23 19:22:59.805186 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.805193 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.805200 | orchestrator | 2025-09-23 19:22:59.805209 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2025-09-23 19:22:59.805216 | orchestrator | Tuesday 23 September 2025 19:18:34 +0000 (0:00:00.320) 0:06:20.253 ***** 2025-09-23 19:22:59.805223 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.805229 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.805237 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.805244 | orchestrator | 2025-09-23 19:22:59.805251 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2025-09-23 19:22:59.805258 | orchestrator | Tuesday 23 September 2025 19:18:36 +0000 (0:00:01.469) 0:06:21.722 ***** 2025-09-23 19:22:59.805266 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-23 19:22:59.805274 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-23 19:22:59.805281 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-23 19:22:59.805289 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.805296 | orchestrator | 2025-09-23 19:22:59.805304 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2025-09-23 19:22:59.805319 | orchestrator | Tuesday 23 September 2025 19:18:36 +0000 (0:00:00.663) 0:06:22.385 ***** 2025-09-23 19:22:59.805326 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.805333 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.805340 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.805347 | orchestrator | 2025-09-23 19:22:59.805355 | orchestrator | PLAY [Apply role ceph-osd] ***************************************************** 2025-09-23 19:22:59.805363 | orchestrator | 2025-09-23 19:22:59.805370 | 
orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-23 19:22:59.805378 | orchestrator | Tuesday 23 September 2025 19:18:37 +0000 (0:00:00.529) 0:06:22.915 ***** 2025-09-23 19:22:59.805428 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.805438 | orchestrator | 2025-09-23 19:22:59.805445 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-23 19:22:59.805453 | orchestrator | Tuesday 23 September 2025 19:18:38 +0000 (0:00:00.719) 0:06:23.634 ***** 2025-09-23 19:22:59.805460 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.805467 | orchestrator | 2025-09-23 19:22:59.805475 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-23 19:22:59.805483 | orchestrator | Tuesday 23 September 2025 19:18:38 +0000 (0:00:00.607) 0:06:24.241 ***** 2025-09-23 19:22:59.805491 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.805499 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.805507 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.805515 | orchestrator | 2025-09-23 19:22:59.805523 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-23 19:22:59.805530 | orchestrator | Tuesday 23 September 2025 19:18:38 +0000 (0:00:00.341) 0:06:24.582 ***** 2025-09-23 19:22:59.805538 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.805546 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.805553 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.805561 | orchestrator | 2025-09-23 19:22:59.805568 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-23 
19:22:59.805576 | orchestrator | Tuesday 23 September 2025 19:18:40 +0000 (0:00:01.161) 0:06:25.744 ***** 2025-09-23 19:22:59.805584 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.805593 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.805601 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.805609 | orchestrator | 2025-09-23 19:22:59.805617 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-23 19:22:59.805624 | orchestrator | Tuesday 23 September 2025 19:18:40 +0000 (0:00:00.781) 0:06:26.526 ***** 2025-09-23 19:22:59.805631 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.805638 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.805646 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.805654 | orchestrator | 2025-09-23 19:22:59.805660 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-23 19:22:59.805667 | orchestrator | Tuesday 23 September 2025 19:18:41 +0000 (0:00:00.758) 0:06:27.284 ***** 2025-09-23 19:22:59.805674 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.805681 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.805688 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.805694 | orchestrator | 2025-09-23 19:22:59.805702 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-23 19:22:59.805708 | orchestrator | Tuesday 23 September 2025 19:18:42 +0000 (0:00:00.346) 0:06:27.631 ***** 2025-09-23 19:22:59.805715 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.805722 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.805729 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.805736 | orchestrator | 2025-09-23 19:22:59.805743 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-23 19:22:59.805758 | orchestrator | 
Tuesday 23 September 2025 19:18:42 +0000 (0:00:00.586) 0:06:28.218 ***** 2025-09-23 19:22:59.805765 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.805772 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.805778 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.805784 | orchestrator | 2025-09-23 19:22:59.805791 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-23 19:22:59.805797 | orchestrator | Tuesday 23 September 2025 19:18:42 +0000 (0:00:00.315) 0:06:28.533 ***** 2025-09-23 19:22:59.805803 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.805810 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.805817 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.805824 | orchestrator | 2025-09-23 19:22:59.805831 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-23 19:22:59.805838 | orchestrator | Tuesday 23 September 2025 19:18:43 +0000 (0:00:00.722) 0:06:29.255 ***** 2025-09-23 19:22:59.805845 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.805852 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.805858 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.805865 | orchestrator | 2025-09-23 19:22:59.805871 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-23 19:22:59.805878 | orchestrator | Tuesday 23 September 2025 19:18:44 +0000 (0:00:00.879) 0:06:30.135 ***** 2025-09-23 19:22:59.805885 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.805892 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.805899 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.805905 | orchestrator | 2025-09-23 19:22:59.805912 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-23 19:22:59.805919 | orchestrator | Tuesday 23 September 2025 19:18:45 
+0000 (0:00:00.568) 0:06:30.704 *****
2025-09-23 19:22:59.805925 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.805932 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.805939 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.805946 | orchestrator |
2025-09-23 19:22:59.805953 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ******************************
2025-09-23 19:22:59.805961 | orchestrator | Tuesday 23 September 2025 19:18:45 +0000 (0:00:00.351) 0:06:31.055 *****
2025-09-23 19:22:59.805968 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.805976 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.805983 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.805994 | orchestrator |
2025-09-23 19:22:59.806004 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ******************************
2025-09-23 19:22:59.806011 | orchestrator | Tuesday 23 September 2025 19:18:45 +0000 (0:00:00.408) 0:06:31.464 *****
2025-09-23 19:22:59.806045 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.806053 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.806061 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.806068 | orchestrator |
2025-09-23 19:22:59.806076 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ******************************
2025-09-23 19:22:59.806101 | orchestrator | Tuesday 23 September 2025 19:18:46 +0000 (0:00:00.315) 0:06:31.779 *****
2025-09-23 19:22:59.806108 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.806117 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.806140 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.806148 | orchestrator |
2025-09-23 19:22:59.806154 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ******************************
2025-09-23 19:22:59.806159 | orchestrator | Tuesday 23 September 2025 19:18:46 +0000 (0:00:00.600) 0:06:32.380 *****
2025-09-23 19:22:59.806163 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.806168 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.806172 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.806177 | orchestrator |
2025-09-23 19:22:59.806181 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ******************************
2025-09-23 19:22:59.806185 | orchestrator | Tuesday 23 September 2025 19:18:47 +0000 (0:00:00.281) 0:06:32.662 *****
2025-09-23 19:22:59.806197 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.806201 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.806206 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.806210 | orchestrator |
2025-09-23 19:22:59.806215 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ******************************
2025-09-23 19:22:59.806219 | orchestrator | Tuesday 23 September 2025 19:18:47 +0000 (0:00:00.272) 0:06:32.934 *****
2025-09-23 19:22:59.806224 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.806228 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.806233 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.806237 | orchestrator |
2025-09-23 19:22:59.806242 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] ****************************
2025-09-23 19:22:59.806246 | orchestrator | Tuesday 23 September 2025 19:18:47 +0000 (0:00:00.284) 0:06:33.219 *****
2025-09-23 19:22:59.806251 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.806255 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.806260 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.806264 | orchestrator |
2025-09-23 19:22:59.806269 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] *************************
2025-09-23 19:22:59.806273 | orchestrator | Tuesday 23 September 2025 19:18:48 +0000 (0:00:00.513) 0:06:33.732 *****
2025-09-23 19:22:59.806278 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.806282 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.806287 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.806291 | orchestrator |
2025-09-23 19:22:59.806296 | orchestrator | TASK [ceph-osd : Set_fact add_osd] *********************************************
2025-09-23 19:22:59.806300 | orchestrator | Tuesday 23 September 2025 19:18:48 +0000 (0:00:00.478) 0:06:34.211 *****
2025-09-23 19:22:59.806304 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.806309 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.806313 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.806318 | orchestrator |
2025-09-23 19:22:59.806322 | orchestrator | TASK [ceph-osd : Set_fact container_exec_cmd] **********************************
2025-09-23 19:22:59.806327 | orchestrator | Tuesday 23 September 2025 19:18:48 +0000 (0:00:00.285) 0:06:34.497 *****
2025-09-23 19:22:59.806332 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2025-09-23 19:22:59.806336 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2025-09-23 19:22:59.806341 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2025-09-23 19:22:59.806346 | orchestrator |
2025-09-23 19:22:59.806350 | orchestrator | TASK [ceph-osd : Include_tasks system_tuning.yml] ******************************
2025-09-23 19:22:59.806355 | orchestrator | Tuesday 23 September 2025 19:18:49 +0000 (0:00:00.719) 0:06:35.216 *****
2025-09-23 19:22:59.806359 | orchestrator | included: /ansible/roles/ceph-osd/tasks/system_tuning.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:22:59.806364 | orchestrator |
2025-09-23 19:22:59.806368 | orchestrator | TASK [ceph-osd : Create tmpfiles.d directory] **********************************
2025-09-23 19:22:59.806373 | orchestrator | Tuesday 23 September 2025 19:18:50 +0000 (0:00:00.700) 0:06:35.917 *****
2025-09-23 19:22:59.806377 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.806382 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.806386 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.806391 | orchestrator |
2025-09-23 19:22:59.806395 | orchestrator | TASK [ceph-osd : Disable transparent hugepage] *********************************
2025-09-23 19:22:59.806400 | orchestrator | Tuesday 23 September 2025 19:18:50 +0000 (0:00:00.266) 0:06:36.183 *****
2025-09-23 19:22:59.806404 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.806409 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.806413 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.806418 | orchestrator |
2025-09-23 19:22:59.806422 | orchestrator | TASK [ceph-osd : Get default vm.min_free_kbytes] *******************************
2025-09-23 19:22:59.806427 | orchestrator | Tuesday 23 September 2025 19:18:51 +0000 (0:00:00.523) 0:06:36.707 *****
2025-09-23 19:22:59.806435 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.806439 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.806444 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.806448 | orchestrator |
2025-09-23 19:22:59.806453 | orchestrator | TASK [ceph-osd : Set_fact vm_min_free_kbytes] **********************************
2025-09-23 19:22:59.806457 | orchestrator | Tuesday 23 September 2025 19:18:51 +0000 (0:00:00.878) 0:06:37.586 *****
2025-09-23 19:22:59.806462 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.806466 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.806471 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.806475 | orchestrator |
2025-09-23 19:22:59.806480 | orchestrator | TASK [ceph-osd : Apply operating system tuning] ********************************
2025-09-23 19:22:59.806484 | orchestrator | Tuesday 23 September 2025 19:18:52 +0000 (0:00:00.349) 0:06:37.935 *****
2025-09-23 19:22:59.806489 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2025-09-23 19:22:59.806494 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2025-09-23 19:22:59.806498 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.file-max', 'value': 26234859})
2025-09-23 19:22:59.806503 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.file-max', 'value': 26234859})
2025-09-23 19:22:59.806507 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2025-09-23 19:22:59.806520 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2025-09-23 19:22:59.806526 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 10})
2025-09-23 19:22:59.806533 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 10})
2025-09-23 19:22:59.806540 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True})
2025-09-23 19:22:59.806547 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2025-09-23 19:22:59.806553 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2025-09-23 19:22:59.806559 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.file-max', 'value': 26234859})
2025-09-23 19:22:59.806566 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0})
2025-09-23 19:22:59.806572 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 10})
2025-09-23 19:22:59.806579 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'})
2025-09-23 19:22:59.806586 | orchestrator |
2025-09-23 19:22:59.806593 | orchestrator | TASK [ceph-osd : Install dependencies] *****************************************
2025-09-23 19:22:59.806599 | orchestrator | Tuesday 23 September 2025 19:18:55 +0000 (0:00:02.981) 0:06:40.917 *****
2025-09-23 19:22:59.806606 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.806614 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.806622 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.806629 | orchestrator |
2025-09-23 19:22:59.806636 | orchestrator | TASK [ceph-osd : Include_tasks common.yml] *************************************
2025-09-23 19:22:59.806643 | orchestrator | Tuesday 23 September 2025 19:18:55 +0000 (0:00:00.314) 0:06:41.231 *****
2025-09-23 19:22:59.806652 | orchestrator | included: /ansible/roles/ceph-osd/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:22:59.806662 | orchestrator |
2025-09-23 19:22:59.806669 | orchestrator | TASK [ceph-osd : Create bootstrap-osd and osd directories] *********************
2025-09-23 19:22:59.806676 | orchestrator | Tuesday 23 September 2025 19:18:56 +0000 (0:00:00.825) 0:06:42.057 *****
2025-09-23 19:22:59.806683 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd/)
2025-09-23 19:22:59.806690 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd/)
2025-09-23 19:22:59.806697 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd/)
2025-09-23 19:22:59.806710 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/osd/)
2025-09-23 19:22:59.806717 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/osd/)
2025-09-23 19:22:59.806724 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/osd/)
2025-09-23 19:22:59.806731 | orchestrator |
2025-09-23 19:22:59.806738 | orchestrator | TASK [ceph-osd : Get keys from monitors] ***************************************
2025-09-23 19:22:59.806744 | orchestrator | Tuesday 23 September 2025 19:18:57 +0000 (0:00:00.954) 0:06:43.011 *****
2025-09-23 19:22:59.806751 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None)
2025-09-23 19:22:59.806759 | orchestrator | skipping: [testbed-node-3] => (item=None)
2025-09-23 19:22:59.806765 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}]
2025-09-23 19:22:59.806773 | orchestrator |
2025-09-23 19:22:59.806781 | orchestrator | TASK [ceph-osd : Copy ceph key(s) if needed] ***********************************
2025-09-23 19:22:59.806789 | orchestrator | Tuesday 23 September 2025 19:18:59 +0000 (0:00:02.016) 0:06:45.028 *****
2025-09-23 19:22:59.806797 | orchestrator | changed: [testbed-node-3] => (item=None)
2025-09-23 19:22:59.806804 | orchestrator | skipping: [testbed-node-3] => (item=None)
2025-09-23 19:22:59.806812 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:22:59.806819 | orchestrator | changed: [testbed-node-4] => (item=None)
2025-09-23 19:22:59.806827 | orchestrator | skipping: [testbed-node-4] => (item=None)
2025-09-23 19:22:59.806835 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:22:59.806842 | orchestrator | changed: [testbed-node-5] => (item=None)
2025-09-23 19:22:59.806850 | orchestrator | skipping: [testbed-node-5] => (item=None)
2025-09-23 19:22:59.806858 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:22:59.806865 | orchestrator |
2025-09-23 19:22:59.806873 | orchestrator | TASK [ceph-osd : Set noup flag] ************************************************
2025-09-23 19:22:59.806881 | orchestrator | Tuesday 23 September 2025 19:19:00 +0000 (0:00:01.432) 0:06:46.461 *****
2025-09-23 19:22:59.806888 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)]
2025-09-23 19:22:59.806896 | orchestrator |
2025-09-23 19:22:59.806903 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm.yml] ******************************
2025-09-23 19:22:59.806909 | orchestrator | Tuesday 23 September 2025 19:19:03 +0000 (0:00:02.346) 0:06:48.808 *****
2025-09-23 19:22:59.806916 | orchestrator | included: /ansible/roles/ceph-osd/tasks/scenarios/lvm.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:22:59.806924 | orchestrator |
2025-09-23 19:22:59.806931 | orchestrator | TASK [ceph-osd : Use ceph-volume to create osds] *******************************
2025-09-23 19:22:59.806939 | orchestrator | Tuesday 23 September 2025 19:19:03 +0000 (0:00:00.798) 0:06:49.606 *****
2025-09-23 19:22:59.806947 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e', 'data_vg': 'ceph-ffaf3874-fb75-58cf-9cbc-48a6d8d7ea6e'})
2025-09-23 19:22:59.806956 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5', 'data_vg': 'ceph-ecd11808-f35b-5e5a-be1d-5423ee6ce3c5'})
2025-09-23 19:22:59.806973 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-1c8984fd-f811-541c-8648-d34ada8a5304', 'data_vg': 'ceph-1c8984fd-f811-541c-8648-d34ada8a5304'})
2025-09-23 19:22:59.806981 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-ad3a695b-9edf-562e-89c9-18fadd13d262', 'data_vg': 'ceph-ad3a695b-9edf-562e-89c9-18fadd13d262'})
2025-09-23 19:22:59.806987 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-8028f60e-1a44-5536-9db2-40f94e230aee', 'data_vg': 'ceph-8028f60e-1a44-5536-9db2-40f94e230aee'})
2025-09-23 19:22:59.806994 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-a2ccb3fa-3e8c-5172-95cb-7cae39233d42', 'data_vg': 'ceph-a2ccb3fa-3e8c-5172-95cb-7cae39233d42'})
2025-09-23 19:22:59.807001 | orchestrator |
2025-09-23 19:22:59.807009 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm-batch.yml] ************************
2025-09-23 19:22:59.807025 | orchestrator | Tuesday 23 September 2025 19:19:45 +0000 (0:00:41.556) 0:07:31.162 *****
2025-09-23 19:22:59.807032 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807039 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807046 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.807053 | orchestrator |
2025-09-23 19:22:59.807060 | orchestrator | TASK [ceph-osd : Include_tasks start_osds.yml] *********************************
2025-09-23 19:22:59.807067 | orchestrator | Tuesday 23 September 2025 19:19:46 +0000 (0:00:00.543) 0:07:31.706 *****
2025-09-23 19:22:59.807074 | orchestrator | included: /ansible/roles/ceph-osd/tasks/start_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:22:59.807097 | orchestrator |
2025-09-23 19:22:59.807105 | orchestrator | TASK [ceph-osd : Get osd ids] **************************************************
2025-09-23 19:22:59.807112 | orchestrator | Tuesday 23 September 2025 19:19:46 +0000 (0:00:00.512) 0:07:32.219 *****
2025-09-23 19:22:59.807119 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.807126 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.807134 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.807142 | orchestrator |
2025-09-23 19:22:59.807147 | orchestrator | TASK [ceph-osd : Collect osd ids] **********************************************
2025-09-23 19:22:59.807152 | orchestrator | Tuesday 23 September 2025 19:19:47 +0000 (0:00:00.649) 0:07:32.868 *****
2025-09-23 19:22:59.807156 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.807161 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.807165 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.807170 | orchestrator |
2025-09-23 19:22:59.807174 | orchestrator | TASK [ceph-osd : Include_tasks systemd.yml] ************************************
2025-09-23 19:22:59.807179 | orchestrator | Tuesday 23 September 2025 19:19:50 +0000 (0:00:03.001) 0:07:35.870 *****
2025-09-23 19:22:59.807183 | orchestrator | included: /ansible/roles/ceph-osd/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:22:59.807188 | orchestrator |
2025-09-23 19:22:59.807193 | orchestrator | TASK [ceph-osd : Generate systemd unit file] ***********************************
2025-09-23 19:22:59.807197 | orchestrator | Tuesday 23 September 2025 19:19:50 +0000 (0:00:00.561) 0:07:36.431 *****
2025-09-23 19:22:59.807202 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:22:59.807206 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:22:59.807211 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:22:59.807215 | orchestrator |
2025-09-23 19:22:59.807220 | orchestrator | TASK [ceph-osd : Generate systemd ceph-osd target file] ************************
2025-09-23 19:22:59.807227 | orchestrator | Tuesday 23 September 2025 19:19:52 +0000 (0:00:01.248) 0:07:37.680 *****
2025-09-23 19:22:59.807234 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:22:59.807241 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:22:59.807248 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:22:59.807255 | orchestrator |
2025-09-23 19:22:59.807262 | orchestrator | TASK [ceph-osd : Enable ceph-osd.target] ***************************************
2025-09-23 19:22:59.807269 | orchestrator | Tuesday 23 September 2025 19:19:53 +0000 (0:00:01.550) 0:07:39.230 *****
2025-09-23 19:22:59.807276 | orchestrator | changed: [testbed-node-3]
2025-09-23 19:22:59.807284 | orchestrator | changed: [testbed-node-5]
2025-09-23 19:22:59.807295 | orchestrator | changed: [testbed-node-4]
2025-09-23 19:22:59.807303 | orchestrator |
2025-09-23 19:22:59.807310 | orchestrator | TASK [ceph-osd : Ensure systemd service override directory exists] *************
2025-09-23 19:22:59.807318 | orchestrator | Tuesday 23 September 2025 19:19:55 +0000 (0:00:01.749) 0:07:40.980 *****
2025-09-23 19:22:59.807326 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807333 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807339 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.807344 | orchestrator |
2025-09-23 19:22:59.807348 | orchestrator | TASK [ceph-osd : Add ceph-osd systemd service overrides] ***********************
2025-09-23 19:22:59.807353 | orchestrator | Tuesday 23 September 2025 19:19:55 +0000 (0:00:00.349) 0:07:41.329 *****
2025-09-23 19:22:59.807357 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807362 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807381 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.807388 | orchestrator |
2025-09-23 19:22:59.807395 | orchestrator | TASK [ceph-osd : Ensure /var/lib/ceph/osd/- is present] *********
2025-09-23 19:22:59.807403 | orchestrator | Tuesday 23 September 2025 19:19:56 +0000 (0:00:00.334) 0:07:41.664 *****
2025-09-23 19:22:59.807411 | orchestrator | ok: [testbed-node-3] => (item=3)
2025-09-23 19:22:59.807418 | orchestrator | ok: [testbed-node-4] => (item=5)
2025-09-23 19:22:59.807425 | orchestrator | ok: [testbed-node-5] => (item=4)
2025-09-23 19:22:59.807433 | orchestrator | ok: [testbed-node-4] => (item=2)
2025-09-23 19:22:59.807440 | orchestrator | ok: [testbed-node-3] => (item=1)
2025-09-23 19:22:59.807447 | orchestrator | ok: [testbed-node-5] => (item=0)
2025-09-23 19:22:59.807454 | orchestrator |
2025-09-23 19:22:59.807459 | orchestrator | TASK [ceph-osd : Write run file in /var/lib/ceph/osd/xxxx/run] *****************
2025-09-23 19:22:59.807467 | orchestrator | Tuesday 23 September 2025 19:19:57 +0000 (0:00:01.352) 0:07:43.016 *****
2025-09-23 19:22:59.807475 | orchestrator | changed: [testbed-node-3] => (item=3)
2025-09-23 19:22:59.807482 | orchestrator | changed: [testbed-node-4] => (item=5)
2025-09-23 19:22:59.807489 | orchestrator | changed: [testbed-node-5] => (item=4)
2025-09-23 19:22:59.807497 | orchestrator | changed: [testbed-node-4] => (item=2)
2025-09-23 19:22:59.807511 | orchestrator | changed: [testbed-node-5] => (item=0)
2025-09-23 19:22:59.807523 | orchestrator | changed: [testbed-node-3] => (item=1)
2025-09-23 19:22:59.807531 | orchestrator |
2025-09-23 19:22:59.807539 | orchestrator | TASK [ceph-osd : Systemd start osd] ********************************************
2025-09-23 19:22:59.807547 | orchestrator | Tuesday 23 September 2025 19:19:59 +0000 (0:00:02.077) 0:07:45.093 *****
2025-09-23 19:22:59.807553 | orchestrator | changed: [testbed-node-4] => (item=5)
2025-09-23 19:22:59.807561 | orchestrator | changed: [testbed-node-3] => (item=3)
2025-09-23 19:22:59.807568 | orchestrator | changed: [testbed-node-5] => (item=4)
2025-09-23 19:22:59.807575 | orchestrator | changed: [testbed-node-3] => (item=1)
2025-09-23 19:22:59.807583 | orchestrator | changed: [testbed-node-4] => (item=2)
2025-09-23 19:22:59.807590 | orchestrator | changed: [testbed-node-5] => (item=0)
2025-09-23 19:22:59.807598 | orchestrator |
2025-09-23 19:22:59.807606 | orchestrator | TASK [ceph-osd : Unset noup flag] **********************************************
2025-09-23 19:22:59.807613 | orchestrator | Tuesday 23 September 2025 19:20:02 +0000 (0:00:03.354) 0:07:48.448 *****
2025-09-23 19:22:59.807621 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807628 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807635 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)]
2025-09-23 19:22:59.807643 | orchestrator |
2025-09-23 19:22:59.807650 | orchestrator | TASK [ceph-osd : Wait for all osd to be up] ************************************
2025-09-23 19:22:59.807658 | orchestrator | Tuesday 23 September 2025 19:20:06 +0000 (0:00:03.198) 0:07:51.646 *****
2025-09-23 19:22:59.807665 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807673 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807680 | orchestrator | FAILED - RETRYING: [testbed-node-5 -> testbed-node-0]: Wait for all osd to be up (60 retries left).
2025-09-23 19:22:59.807684 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)]
2025-09-23 19:22:59.807689 | orchestrator |
2025-09-23 19:22:59.807693 | orchestrator | TASK [ceph-osd : Include crush_rules.yml] **************************************
2025-09-23 19:22:59.807698 | orchestrator | Tuesday 23 September 2025 19:20:18 +0000 (0:00:12.668) 0:08:04.314 *****
2025-09-23 19:22:59.807702 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807707 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807711 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.807715 | orchestrator |
2025-09-23 19:22:59.807720 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] **********************
2025-09-23 19:22:59.807724 | orchestrator | Tuesday 23 September 2025 19:20:19 +0000 (0:00:00.827) 0:08:05.142 *****
2025-09-23 19:22:59.807729 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807738 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807742 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.807746 | orchestrator |
2025-09-23 19:22:59.807751 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] **********************************
2025-09-23 19:22:59.807755 | orchestrator | Tuesday 23 September 2025 19:20:20 +0000 (0:00:00.538) 0:08:05.680 *****
2025-09-23 19:22:59.807760 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:22:59.807764 | orchestrator |
2025-09-23 19:22:59.807769 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] **********************
2025-09-23 19:22:59.807773 | orchestrator | Tuesday 23 September 2025 19:20:20 +0000 (0:00:00.517) 0:08:06.198 *****
2025-09-23 19:22:59.807778 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2025-09-23 19:22:59.807782 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2025-09-23 19:22:59.807787 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2025-09-23 19:22:59.807791 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807796 | orchestrator |
2025-09-23 19:22:59.807800 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ********
2025-09-23 19:22:59.807805 | orchestrator | Tuesday 23 September 2025 19:20:20 +0000 (0:00:00.376) 0:08:06.574 *****
2025-09-23 19:22:59.807809 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807813 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807818 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.807822 | orchestrator |
2025-09-23 19:22:59.807827 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] *******************************
2025-09-23 19:22:59.807831 | orchestrator | Tuesday 23 September 2025 19:20:21 +0000 (0:00:00.215) 0:08:06.891 *****
2025-09-23 19:22:59.807836 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807840 | orchestrator |
2025-09-23 19:22:59.807845 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] ***********************
2025-09-23 19:22:59.807849 | orchestrator | Tuesday 23 September 2025 19:20:21 +0000 (0:00:00.537) 0:08:07.107 *****
2025-09-23 19:22:59.807854 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807858 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.807863 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.807867 | orchestrator |
2025-09-23 19:22:59.807872 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] *********************************
2025-09-23 19:22:59.807876 | orchestrator | Tuesday 23 September 2025 19:20:22 +0000 (0:00:00.537) 0:08:07.644 *****
2025-09-23 19:22:59.807880 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807885 | orchestrator |
2025-09-23 19:22:59.807889 | orchestrator | RUNNING HANDLER [ceph-handler : Get balancer module status] ********************
2025-09-23 19:22:59.807894 | orchestrator | Tuesday 23 September 2025 19:20:22 +0000 (0:00:00.249) 0:08:07.893 *****
2025-09-23 19:22:59.807898 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807903 | orchestrator |
2025-09-23 19:22:59.807907 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] **************
2025-09-23 19:22:59.807912 | orchestrator | Tuesday 23 September 2025 19:20:22 +0000 (0:00:00.214) 0:08:08.108 *****
2025-09-23 19:22:59.807916 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807921 | orchestrator |
2025-09-23 19:22:59.807925 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ******************************
2025-09-23 19:22:59.807930 | orchestrator | Tuesday 23 September 2025 19:20:22 +0000 (0:00:00.131) 0:08:08.239 *****
2025-09-23 19:22:59.807934 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807939 | orchestrator |
2025-09-23 19:22:59.807946 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] *****************
2025-09-23 19:22:59.807954 | orchestrator | Tuesday 23 September 2025 19:20:22 +0000 (0:00:00.248) 0:08:08.487 *****
2025-09-23 19:22:59.807959 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807963 | orchestrator |
2025-09-23 19:22:59.807968 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] *******************
2025-09-23 19:22:59.807976 | orchestrator | Tuesday 23 September 2025 19:20:23 +0000 (0:00:00.219) 0:08:08.707 *****
2025-09-23 19:22:59.807981 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2025-09-23 19:22:59.807985 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2025-09-23 19:22:59.807990 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2025-09-23 19:22:59.807994 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.807998 | orchestrator |
2025-09-23 19:22:59.808003 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] *********
2025-09-23 19:22:59.808007 | orchestrator | Tuesday 23 September 2025 19:20:23 +0000 (0:00:00.387) 0:08:09.095 *****
2025-09-23 19:22:59.808012 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808016 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.808021 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.808025 | orchestrator |
2025-09-23 19:22:59.808031 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] ***************
2025-09-23 19:22:59.808039 | orchestrator | Tuesday 23 September 2025 19:20:23 +0000 (0:00:00.297) 0:08:09.392 *****
2025-09-23 19:22:59.808046 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808053 | orchestrator |
2025-09-23 19:22:59.808063 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] ****************************
2025-09-23 19:22:59.808071 | orchestrator | Tuesday 23 September 2025 19:20:24 +0000 (0:00:00.760) 0:08:10.152 *****
2025-09-23 19:22:59.808079 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808192 | orchestrator |
2025-09-23 19:22:59.808201 | orchestrator | PLAY [Apply role ceph-crash] ***************************************************
2025-09-23 19:22:59.808206 | orchestrator |
2025-09-23 19:22:59.808210 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2025-09-23 19:22:59.808214 | orchestrator | Tuesday 23 September 2025 19:20:25 +0000 (0:00:00.669) 0:08:10.822 *****
2025-09-23 19:22:59.808219 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:22:59.808224 | orchestrator |
2025-09-23 19:22:59.808228 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2025-09-23 19:22:59.808232 | orchestrator | Tuesday 23 September 2025 19:20:26 +0000 (0:00:01.187) 0:08:12.010 *****
2025-09-23 19:22:59.808236 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:22:59.808241 | orchestrator |
2025-09-23 19:22:59.808245 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2025-09-23 19:22:59.808249 | orchestrator | Tuesday 23 September 2025 19:20:27 +0000 (0:00:01.166) 0:08:13.176 *****
2025-09-23 19:22:59.808253 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808257 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.808261 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.808265 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.808269 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.808273 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.808277 | orchestrator |
2025-09-23 19:22:59.808281 | orchestrator | TASK [ceph-handler : Check for an osd container] *******************************
2025-09-23 19:22:59.808285 | orchestrator | Tuesday 23 September 2025 19:20:28 +0000 (0:00:01.218) 0:08:14.395 *****
2025-09-23 19:22:59.808290 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.808294 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.808298 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.808302 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.808306 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.808310 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.808314 | orchestrator |
2025-09-23 19:22:59.808318 | orchestrator | TASK [ceph-handler : Check for a mds container] ********************************
2025-09-23 19:22:59.808322 | orchestrator | Tuesday 23 September 2025 19:20:29 +0000 (0:00:00.738) 0:08:15.133 *****
2025-09-23 19:22:59.808332 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.808336 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.808341 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.808345 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.808349 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.808353 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.808357 | orchestrator |
2025-09-23 19:22:59.808361 | orchestrator | TASK [ceph-handler : Check for a rgw container] ********************************
2025-09-23 19:22:59.808365 | orchestrator | Tuesday 23 September 2025 19:20:30 +0000 (0:00:00.940) 0:08:16.074 *****
2025-09-23 19:22:59.808369 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.808373 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.808377 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.808381 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.808385 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.808389 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.808393 | orchestrator |
2025-09-23 19:22:59.808397 | orchestrator | TASK [ceph-handler : Check for a mgr container] ********************************
2025-09-23 19:22:59.808401 | orchestrator | Tuesday 23 September 2025 19:20:31 +0000 (0:00:00.735) 0:08:16.810 *****
2025-09-23 19:22:59.808405 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808409 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.808413 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.808417 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.808421 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.808425 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.808430 | orchestrator |
2025-09-23 19:22:59.808434 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] *************************
2025-09-23 19:22:59.808438 | orchestrator | Tuesday 23 September 2025 19:20:32 +0000 (0:00:00.942) 0:08:17.752 *****
2025-09-23 19:22:59.808442 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808446 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.808456 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.808464 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.808468 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.808472 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.808476 | orchestrator |
2025-09-23 19:22:59.808481 | orchestrator | TASK [ceph-handler : Check for a nfs container] ********************************
2025-09-23 19:22:59.808485 | orchestrator | Tuesday 23 September 2025 19:20:32 +0000 (0:00:00.703) 0:08:18.456 *****
2025-09-23 19:22:59.808489 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808493 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.808497 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.808501 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.808505 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.808509 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.808521 | orchestrator |
2025-09-23 19:22:59.808525 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] *************************
2025-09-23 19:22:59.808529 | orchestrator | Tuesday 23 September 2025 19:20:33 +0000 (0:00:00.519) 0:08:18.975 *****
2025-09-23 19:22:59.808534 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.808538 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.808542 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.808546 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.808556 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.808560 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.808564 | orchestrator |
2025-09-23 19:22:59.808568 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] **********************
2025-09-23 19:22:59.808572 | orchestrator | Tuesday 23 September 2025 19:20:34 +0000 (0:00:01.144) 0:08:20.120 *****
2025-09-23 19:22:59.808576 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:22:59.808580 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:22:59.808584 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:22:59.808588 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.808592 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:22:59.808599 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:22:59.808604 | orchestrator |
2025-09-23 19:22:59.808608 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] *******************
2025-09-23 19:22:59.808612 | orchestrator | Tuesday 23 September 2025 19:20:35 +0000 (0:00:00.947) 0:08:21.067 *****
2025-09-23 19:22:59.808616 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808620 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.808624 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.808628 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:22:59.808632 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:22:59.808636 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:22:59.808640 | orchestrator |
2025-09-23 19:22:59.808644 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ******************************
2025-09-23 19:22:59.808648 | orchestrator | Tuesday 23 September 2025 19:20:36 +0000 (0:00:00.695) 0:08:21.762 *****
2025-09-23 19:22:59.808652 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:22:59.808656 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:22:59.808660 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:22:59.808665 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:22:59.808669 | orchestrator | ok: [testbed-node-1]
2025-09-23
19:22:59.808673 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.808677 | orchestrator | 2025-09-23 19:22:59.808681 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-23 19:22:59.808685 | orchestrator | Tuesday 23 September 2025 19:20:36 +0000 (0:00:00.523) 0:08:22.286 ***** 2025-09-23 19:22:59.808689 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.808693 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.808697 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.808701 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.808705 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.808709 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.808713 | orchestrator | 2025-09-23 19:22:59.808717 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-23 19:22:59.808721 | orchestrator | Tuesday 23 September 2025 19:20:37 +0000 (0:00:00.678) 0:08:22.965 ***** 2025-09-23 19:22:59.808726 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.808730 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.808734 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.808738 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.808742 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.808746 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.808750 | orchestrator | 2025-09-23 19:22:59.808754 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-23 19:22:59.808758 | orchestrator | Tuesday 23 September 2025 19:20:37 +0000 (0:00:00.511) 0:08:23.477 ***** 2025-09-23 19:22:59.808762 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.808766 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.808770 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.808774 | orchestrator | skipping: [testbed-node-0] 
2025-09-23 19:22:59.808778 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.808782 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.808786 | orchestrator | 2025-09-23 19:22:59.808790 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-23 19:22:59.808794 | orchestrator | Tuesday 23 September 2025 19:20:38 +0000 (0:00:00.677) 0:08:24.154 ***** 2025-09-23 19:22:59.808798 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.808803 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.808807 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.808811 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.808815 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.808819 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.808823 | orchestrator | 2025-09-23 19:22:59.808827 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-23 19:22:59.808834 | orchestrator | Tuesday 23 September 2025 19:20:39 +0000 (0:00:00.627) 0:08:24.782 ***** 2025-09-23 19:22:59.808838 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.808842 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.808846 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.808851 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:22:59.808855 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:22:59.808859 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:22:59.808863 | orchestrator | 2025-09-23 19:22:59.808867 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-23 19:22:59.808871 | orchestrator | Tuesday 23 September 2025 19:20:39 +0000 (0:00:00.689) 0:08:25.471 ***** 2025-09-23 19:22:59.808878 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.808884 | orchestrator | skipping: [testbed-node-4] 
2025-09-23 19:22:59.808889 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.808893 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.808897 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.808901 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.808905 | orchestrator | 2025-09-23 19:22:59.808909 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-23 19:22:59.808913 | orchestrator | Tuesday 23 September 2025 19:20:40 +0000 (0:00:00.560) 0:08:26.032 ***** 2025-09-23 19:22:59.808917 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.808922 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.808926 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.808930 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.808934 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.808938 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.808942 | orchestrator | 2025-09-23 19:22:59.808946 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-23 19:22:59.808950 | orchestrator | Tuesday 23 September 2025 19:20:41 +0000 (0:00:00.831) 0:08:26.863 ***** 2025-09-23 19:22:59.808954 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.808958 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.808962 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.808966 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.808970 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.808974 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.808978 | orchestrator | 2025-09-23 19:22:59.808982 | orchestrator | TASK [ceph-crash : Create client.crash keyring] ******************************** 2025-09-23 19:22:59.808986 | orchestrator | Tuesday 23 September 2025 19:20:42 +0000 (0:00:01.232) 0:08:28.096 ***** 2025-09-23 19:22:59.808990 | orchestrator | changed: [testbed-node-3 -> 
testbed-node-0(192.168.16.10)] 2025-09-23 19:22:59.808995 | orchestrator | 2025-09-23 19:22:59.808999 | orchestrator | TASK [ceph-crash : Get keys from monitors] ************************************* 2025-09-23 19:22:59.809003 | orchestrator | Tuesday 23 September 2025 19:20:46 +0000 (0:00:03.914) 0:08:32.010 ***** 2025-09-23 19:22:59.809007 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2025-09-23 19:22:59.809011 | orchestrator | 2025-09-23 19:22:59.809015 | orchestrator | TASK [ceph-crash : Copy ceph key(s) if needed] ********************************* 2025-09-23 19:22:59.809019 | orchestrator | Tuesday 23 September 2025 19:20:48 +0000 (0:00:01.978) 0:08:33.988 ***** 2025-09-23 19:22:59.809023 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.809027 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.809031 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.809035 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.809039 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.809043 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.809047 | orchestrator | 2025-09-23 19:22:59.809051 | orchestrator | TASK [ceph-crash : Create /var/lib/ceph/crash/posted] ************************** 2025-09-23 19:22:59.809056 | orchestrator | Tuesday 23 September 2025 19:20:49 +0000 (0:00:01.504) 0:08:35.493 ***** 2025-09-23 19:22:59.809060 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.809064 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.809071 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.809075 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.809079 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.809100 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.809104 | orchestrator | 2025-09-23 19:22:59.809108 | orchestrator | TASK [ceph-crash : Include_tasks systemd.yml] ********************************** 
2025-09-23 19:22:59.809112 | orchestrator | Tuesday 23 September 2025 19:20:51 +0000 (0:00:01.276) 0:08:36.769 ***** 2025-09-23 19:22:59.809117 | orchestrator | included: /ansible/roles/ceph-crash/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.809122 | orchestrator | 2025-09-23 19:22:59.809126 | orchestrator | TASK [ceph-crash : Generate systemd unit file for ceph-crash container] ******** 2025-09-23 19:22:59.809130 | orchestrator | Tuesday 23 September 2025 19:20:52 +0000 (0:00:01.208) 0:08:37.978 ***** 2025-09-23 19:22:59.809134 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.809138 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.809142 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.809146 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.809150 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.809154 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.809158 | orchestrator | 2025-09-23 19:22:59.809163 | orchestrator | TASK [ceph-crash : Start the ceph-crash service] ******************************* 2025-09-23 19:22:59.809167 | orchestrator | Tuesday 23 September 2025 19:20:53 +0000 (0:00:01.499) 0:08:39.478 ***** 2025-09-23 19:22:59.809171 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.809175 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.809179 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.809183 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.809187 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.809191 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.809195 | orchestrator | 2025-09-23 19:22:59.809199 | orchestrator | RUNNING HANDLER [ceph-handler : Ceph crash handler] **************************** 2025-09-23 19:22:59.809203 | orchestrator | Tuesday 23 September 2025 19:20:57 +0000 (0:00:03.498) 
0:08:42.977 ***** 2025-09-23 19:22:59.809208 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_crash.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:22:59.809212 | orchestrator | 2025-09-23 19:22:59.809216 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called before restart] ****** 2025-09-23 19:22:59.809220 | orchestrator | Tuesday 23 September 2025 19:20:58 +0000 (0:00:01.181) 0:08:44.158 ***** 2025-09-23 19:22:59.809224 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809228 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809232 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809236 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.809240 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.809244 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.809249 | orchestrator | 2025-09-23 19:22:59.809253 | orchestrator | RUNNING HANDLER [ceph-handler : Restart the ceph-crash service] **************** 2025-09-23 19:22:59.809257 | orchestrator | Tuesday 23 September 2025 19:20:59 +0000 (0:00:00.617) 0:08:44.776 ***** 2025-09-23 19:22:59.809264 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.809271 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.809275 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.809279 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:22:59.809283 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:22:59.809287 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:22:59.809291 | orchestrator | 2025-09-23 19:22:59.809296 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called after restart] ******* 2025-09-23 19:22:59.809300 | orchestrator | Tuesday 23 September 2025 19:21:01 +0000 (0:00:02.506) 0:08:47.283 ***** 2025-09-23 19:22:59.809304 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809308 | orchestrator | 
ok: [testbed-node-4] 2025-09-23 19:22:59.809312 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809319 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:22:59.809323 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:22:59.809327 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:22:59.809331 | orchestrator | 2025-09-23 19:22:59.809335 | orchestrator | PLAY [Apply role ceph-mds] ***************************************************** 2025-09-23 19:22:59.809339 | orchestrator | 2025-09-23 19:22:59.809343 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-23 19:22:59.809348 | orchestrator | Tuesday 23 September 2025 19:21:02 +0000 (0:00:00.812) 0:08:48.095 ***** 2025-09-23 19:22:59.809352 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.809356 | orchestrator | 2025-09-23 19:22:59.809360 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-23 19:22:59.809364 | orchestrator | Tuesday 23 September 2025 19:21:03 +0000 (0:00:00.736) 0:08:48.831 ***** 2025-09-23 19:22:59.809369 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.809373 | orchestrator | 2025-09-23 19:22:59.809377 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-23 19:22:59.809381 | orchestrator | Tuesday 23 September 2025 19:21:03 +0000 (0:00:00.534) 0:08:49.366 ***** 2025-09-23 19:22:59.809385 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809389 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809393 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809397 | orchestrator | 2025-09-23 19:22:59.809401 | orchestrator | TASK [ceph-handler : Check for an osd container] 
******************************* 2025-09-23 19:22:59.809405 | orchestrator | Tuesday 23 September 2025 19:21:04 +0000 (0:00:00.518) 0:08:49.884 ***** 2025-09-23 19:22:59.809409 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809413 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809417 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809422 | orchestrator | 2025-09-23 19:22:59.809426 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-23 19:22:59.809430 | orchestrator | Tuesday 23 September 2025 19:21:04 +0000 (0:00:00.711) 0:08:50.596 ***** 2025-09-23 19:22:59.809434 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809438 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809442 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809446 | orchestrator | 2025-09-23 19:22:59.809450 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-23 19:22:59.809454 | orchestrator | Tuesday 23 September 2025 19:21:05 +0000 (0:00:00.751) 0:08:51.348 ***** 2025-09-23 19:22:59.809458 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809462 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809466 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809470 | orchestrator | 2025-09-23 19:22:59.809475 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-23 19:22:59.809479 | orchestrator | Tuesday 23 September 2025 19:21:06 +0000 (0:00:00.738) 0:08:52.086 ***** 2025-09-23 19:22:59.809483 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809487 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809491 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809495 | orchestrator | 2025-09-23 19:22:59.809499 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-23 
19:22:59.809503 | orchestrator | Tuesday 23 September 2025 19:21:07 +0000 (0:00:00.529) 0:08:52.616 ***** 2025-09-23 19:22:59.809507 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809511 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809515 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809519 | orchestrator | 2025-09-23 19:22:59.809524 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-23 19:22:59.809528 | orchestrator | Tuesday 23 September 2025 19:21:07 +0000 (0:00:00.310) 0:08:52.926 ***** 2025-09-23 19:22:59.809535 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809539 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809543 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809547 | orchestrator | 2025-09-23 19:22:59.809551 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-23 19:22:59.809556 | orchestrator | Tuesday 23 September 2025 19:21:07 +0000 (0:00:00.301) 0:08:53.227 ***** 2025-09-23 19:22:59.809560 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809564 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809568 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809572 | orchestrator | 2025-09-23 19:22:59.809576 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-23 19:22:59.809580 | orchestrator | Tuesday 23 September 2025 19:21:08 +0000 (0:00:00.770) 0:08:53.997 ***** 2025-09-23 19:22:59.809584 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809588 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809592 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809596 | orchestrator | 2025-09-23 19:22:59.809600 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-23 19:22:59.809604 | orchestrator | 
Tuesday 23 September 2025 19:21:09 +0000 (0:00:01.020) 0:08:55.018 ***** 2025-09-23 19:22:59.809609 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809613 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809617 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809621 | orchestrator | 2025-09-23 19:22:59.809625 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-23 19:22:59.809629 | orchestrator | Tuesday 23 September 2025 19:21:09 +0000 (0:00:00.328) 0:08:55.346 ***** 2025-09-23 19:22:59.809635 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809642 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809647 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809651 | orchestrator | 2025-09-23 19:22:59.809655 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-23 19:22:59.809659 | orchestrator | Tuesday 23 September 2025 19:21:10 +0000 (0:00:00.298) 0:08:55.645 ***** 2025-09-23 19:22:59.809663 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809667 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809671 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809675 | orchestrator | 2025-09-23 19:22:59.809679 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-23 19:22:59.809683 | orchestrator | Tuesday 23 September 2025 19:21:10 +0000 (0:00:00.386) 0:08:56.031 ***** 2025-09-23 19:22:59.809688 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809692 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809696 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809700 | orchestrator | 2025-09-23 19:22:59.809704 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-23 19:22:59.809708 | orchestrator | Tuesday 23 September 2025 19:21:11 
+0000 (0:00:00.621) 0:08:56.653 ***** 2025-09-23 19:22:59.809712 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809716 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809720 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809724 | orchestrator | 2025-09-23 19:22:59.809728 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-23 19:22:59.809732 | orchestrator | Tuesday 23 September 2025 19:21:11 +0000 (0:00:00.284) 0:08:56.937 ***** 2025-09-23 19:22:59.809736 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809740 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809744 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809748 | orchestrator | 2025-09-23 19:22:59.809752 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-23 19:22:59.809757 | orchestrator | Tuesday 23 September 2025 19:21:11 +0000 (0:00:00.278) 0:08:57.216 ***** 2025-09-23 19:22:59.809761 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809765 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809774 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809778 | orchestrator | 2025-09-23 19:22:59.809782 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-23 19:22:59.809786 | orchestrator | Tuesday 23 September 2025 19:21:11 +0000 (0:00:00.268) 0:08:57.484 ***** 2025-09-23 19:22:59.809790 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809794 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809799 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809803 | orchestrator | 2025-09-23 19:22:59.809807 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-23 19:22:59.809811 | orchestrator | Tuesday 23 September 2025 19:21:12 +0000 (0:00:00.430) 
0:08:57.914 ***** 2025-09-23 19:22:59.809815 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809819 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809823 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809827 | orchestrator | 2025-09-23 19:22:59.809831 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-23 19:22:59.809835 | orchestrator | Tuesday 23 September 2025 19:21:12 +0000 (0:00:00.269) 0:08:58.184 ***** 2025-09-23 19:22:59.809839 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.809843 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.809847 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.809852 | orchestrator | 2025-09-23 19:22:59.809856 | orchestrator | TASK [ceph-mds : Include create_mds_filesystems.yml] *************************** 2025-09-23 19:22:59.809860 | orchestrator | Tuesday 23 September 2025 19:21:13 +0000 (0:00:00.473) 0:08:58.657 ***** 2025-09-23 19:22:59.809864 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.809868 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.809872 | orchestrator | included: /ansible/roles/ceph-mds/tasks/create_mds_filesystems.yml for testbed-node-3 2025-09-23 19:22:59.809876 | orchestrator | 2025-09-23 19:22:59.809880 | orchestrator | TASK [ceph-facts : Get current default crush rule details] ********************* 2025-09-23 19:22:59.809884 | orchestrator | Tuesday 23 September 2025 19:21:13 +0000 (0:00:00.526) 0:08:59.183 ***** 2025-09-23 19:22:59.809888 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2025-09-23 19:22:59.809892 | orchestrator | 2025-09-23 19:22:59.809896 | orchestrator | TASK [ceph-facts : Get current default crush rule name] ************************ 2025-09-23 19:22:59.809901 | orchestrator | Tuesday 23 September 2025 19:21:15 +0000 (0:00:02.206) 0:09:01.390 ***** 2025-09-23 19:22:59.809906 | orchestrator | skipping: [testbed-node-3] => 
(item={'rule_id': 0, 'rule_name': 'replicated_rule', 'type': 1, 'steps': [{'op': 'take', 'item': -1, 'item_name': 'default'}, {'op': 'chooseleaf_firstn', 'num': 0, 'type': 'host'}, {'op': 'emit'}]})  2025-09-23 19:22:59.809912 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.809916 | orchestrator | 2025-09-23 19:22:59.809920 | orchestrator | TASK [ceph-mds : Create filesystem pools] ************************************** 2025-09-23 19:22:59.809924 | orchestrator | Tuesday 23 September 2025 19:21:15 +0000 (0:00:00.220) 0:09:01.611 ***** 2025-09-23 19:22:59.809930 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_data', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-23 19:22:59.809939 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_metadata', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-23 19:22:59.809943 | orchestrator | 2025-09-23 19:22:59.809947 | orchestrator | TASK [ceph-mds : Create ceph filesystem] *************************************** 2025-09-23 19:22:59.809954 | orchestrator | Tuesday 23 September 2025 19:21:24 +0000 (0:00:08.236) 0:09:09.847 ***** 2025-09-23 19:22:59.809961 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2025-09-23 19:22:59.809965 | orchestrator | 2025-09-23 19:22:59.809969 | orchestrator | TASK [ceph-mds : Include common.yml] ******************************************* 2025-09-23 19:22:59.809976 | orchestrator | Tuesday 23 September 2025 19:21:27 +0000 (0:00:03.724) 0:09:13.572 ***** 2025-09-23 19:22:59.809981 | orchestrator | included: /ansible/roles/ceph-mds/tasks/common.yml for testbed-node-3, testbed-node-4, 
testbed-node-5 2025-09-23 19:22:59.809985 | orchestrator | 2025-09-23 19:22:59.809989 | orchestrator | TASK [ceph-mds : Create bootstrap-mds and mds directories] ********************* 2025-09-23 19:22:59.809993 | orchestrator | Tuesday 23 September 2025 19:21:28 +0000 (0:00:00.744) 0:09:14.316 ***** 2025-09-23 19:22:59.809997 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds/) 2025-09-23 19:22:59.810001 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds/) 2025-09-23 19:22:59.810005 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds/) 2025-09-23 19:22:59.810009 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds/ceph-testbed-node-3) 2025-09-23 19:22:59.810034 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds/ceph-testbed-node-4) 2025-09-23 19:22:59.810038 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds/ceph-testbed-node-5) 2025-09-23 19:22:59.810042 | orchestrator | 2025-09-23 19:22:59.810046 | orchestrator | TASK [ceph-mds : Get keys from monitors] *************************************** 2025-09-23 19:22:59.810050 | orchestrator | Tuesday 23 September 2025 19:21:29 +0000 (0:00:01.047) 0:09:15.364 ***** 2025-09-23 19:22:59.810054 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.810058 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-23 19:22:59.810062 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-23 19:22:59.810066 | orchestrator | 2025-09-23 19:22:59.810070 | orchestrator | TASK [ceph-mds : Copy ceph key(s) if needed] *********************************** 2025-09-23 19:22:59.810075 | orchestrator | Tuesday 23 September 2025 19:21:31 +0000 (0:00:02.193) 0:09:17.557 ***** 2025-09-23 19:22:59.810079 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-09-23 19:22:59.810091 | orchestrator | skipping: [testbed-node-3] 
=> (item=None)  2025-09-23 19:22:59.810096 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.810100 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-09-23 19:22:59.810104 | orchestrator | skipping: [testbed-node-5] => (item=None)  2025-09-23 19:22:59.810108 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.810112 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-09-23 19:22:59.810116 | orchestrator | skipping: [testbed-node-4] => (item=None)  2025-09-23 19:22:59.810120 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.810124 | orchestrator | 2025-09-23 19:22:59.810129 | orchestrator | TASK [ceph-mds : Create mds keyring] ******************************************* 2025-09-23 19:22:59.810133 | orchestrator | Tuesday 23 September 2025 19:21:33 +0000 (0:00:01.205) 0:09:18.763 ***** 2025-09-23 19:22:59.810137 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.810141 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.810145 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.810149 | orchestrator | 2025-09-23 19:22:59.810153 | orchestrator | TASK [ceph-mds : Non_containerized.yml] **************************************** 2025-09-23 19:22:59.810157 | orchestrator | Tuesday 23 September 2025 19:21:36 +0000 (0:00:03.034) 0:09:21.798 ***** 2025-09-23 19:22:59.810161 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810166 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810170 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.810174 | orchestrator | 2025-09-23 19:22:59.810178 | orchestrator | TASK [ceph-mds : Containerized.yml] ******************************************** 2025-09-23 19:22:59.810182 | orchestrator | Tuesday 23 September 2025 19:21:36 +0000 (0:00:00.319) 0:09:22.118 ***** 2025-09-23 19:22:59.810186 | orchestrator | included: /ansible/roles/ceph-mds/tasks/containerized.yml for testbed-node-3, testbed-node-4, 
testbed-node-5 2025-09-23 19:22:59.810190 | orchestrator | 2025-09-23 19:22:59.810194 | orchestrator | TASK [ceph-mds : Include_tasks systemd.yml] ************************************ 2025-09-23 19:22:59.810202 | orchestrator | Tuesday 23 September 2025 19:21:37 +0000 (0:00:00.516) 0:09:22.634 ***** 2025-09-23 19:22:59.810206 | orchestrator | included: /ansible/roles/ceph-mds/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.810210 | orchestrator | 2025-09-23 19:22:59.810214 | orchestrator | TASK [ceph-mds : Generate systemd unit file] *********************************** 2025-09-23 19:22:59.810218 | orchestrator | Tuesday 23 September 2025 19:21:37 +0000 (0:00:00.709) 0:09:23.344 ***** 2025-09-23 19:22:59.810222 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.810226 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.810230 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.810235 | orchestrator | 2025-09-23 19:22:59.810239 | orchestrator | TASK [ceph-mds : Generate systemd ceph-mds target file] ************************ 2025-09-23 19:22:59.810243 | orchestrator | Tuesday 23 September 2025 19:21:39 +0000 (0:00:01.279) 0:09:24.623 ***** 2025-09-23 19:22:59.810247 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.810251 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.810255 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.810259 | orchestrator | 2025-09-23 19:22:59.810263 | orchestrator | TASK [ceph-mds : Enable ceph-mds.target] *************************************** 2025-09-23 19:22:59.810267 | orchestrator | Tuesday 23 September 2025 19:21:40 +0000 (0:00:01.167) 0:09:25.791 ***** 2025-09-23 19:22:59.810271 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.810275 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.810280 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.810284 | orchestrator | 2025-09-23 
19:22:59.810288 | orchestrator | TASK [ceph-mds : Systemd start mds container] ********************************** 2025-09-23 19:22:59.810292 | orchestrator | Tuesday 23 September 2025 19:21:41 +0000 (0:00:01.730) 0:09:27.522 ***** 2025-09-23 19:22:59.810296 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.810303 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.810310 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.810315 | orchestrator | 2025-09-23 19:22:59.810319 | orchestrator | TASK [ceph-mds : Wait for mds socket to exist] ********************************* 2025-09-23 19:22:59.810323 | orchestrator | Tuesday 23 September 2025 19:21:43 +0000 (0:00:02.061) 0:09:29.583 ***** 2025-09-23 19:22:59.810327 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810331 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810336 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810340 | orchestrator | 2025-09-23 19:22:59.810344 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-23 19:22:59.810348 | orchestrator | Tuesday 23 September 2025 19:21:45 +0000 (0:00:01.117) 0:09:30.700 ***** 2025-09-23 19:22:59.810352 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.810356 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.810360 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.810364 | orchestrator | 2025-09-23 19:22:59.810368 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2025-09-23 19:22:59.810372 | orchestrator | Tuesday 23 September 2025 19:21:45 +0000 (0:00:00.810) 0:09:31.510 ***** 2025-09-23 19:22:59.810377 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.810381 | orchestrator | 2025-09-23 19:22:59.810385 | orchestrator | RUNNING HANDLER [ceph-handler : Set 
_mds_handler_called before restart] ******** 2025-09-23 19:22:59.810389 | orchestrator | Tuesday 23 September 2025 19:21:46 +0000 (0:00:00.440) 0:09:31.951 ***** 2025-09-23 19:22:59.810393 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810397 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810401 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810405 | orchestrator | 2025-09-23 19:22:59.810410 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mds restart script] *********************** 2025-09-23 19:22:59.810414 | orchestrator | Tuesday 23 September 2025 19:21:46 +0000 (0:00:00.272) 0:09:32.223 ***** 2025-09-23 19:22:59.810418 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.810426 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.810430 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.810434 | orchestrator | 2025-09-23 19:22:59.810438 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2025-09-23 19:22:59.810442 | orchestrator | Tuesday 23 September 2025 19:21:47 +0000 (0:00:01.329) 0:09:33.553 ***** 2025-09-23 19:22:59.810446 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.810450 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.810454 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.810459 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810463 | orchestrator | 2025-09-23 19:22:59.810467 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2025-09-23 19:22:59.810471 | orchestrator | Tuesday 23 September 2025 19:21:48 +0000 (0:00:00.597) 0:09:34.150 ***** 2025-09-23 19:22:59.810475 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810479 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810483 | orchestrator | ok: [testbed-node-5] 2025-09-23 
19:22:59.810487 | orchestrator | 2025-09-23 19:22:59.810492 | orchestrator | PLAY [Apply role ceph-rgw] ***************************************************** 2025-09-23 19:22:59.810496 | orchestrator | 2025-09-23 19:22:59.810500 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-23 19:22:59.810504 | orchestrator | Tuesday 23 September 2025 19:21:49 +0000 (0:00:00.517) 0:09:34.668 ***** 2025-09-23 19:22:59.810508 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.810512 | orchestrator | 2025-09-23 19:22:59.810516 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-23 19:22:59.810521 | orchestrator | Tuesday 23 September 2025 19:21:49 +0000 (0:00:00.717) 0:09:35.386 ***** 2025-09-23 19:22:59.810525 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.810529 | orchestrator | 2025-09-23 19:22:59.810533 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-23 19:22:59.810537 | orchestrator | Tuesday 23 September 2025 19:21:50 +0000 (0:00:00.507) 0:09:35.893 ***** 2025-09-23 19:22:59.810541 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810545 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810549 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.810554 | orchestrator | 2025-09-23 19:22:59.810558 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-23 19:22:59.810562 | orchestrator | Tuesday 23 September 2025 19:21:50 +0000 (0:00:00.487) 0:09:36.381 ***** 2025-09-23 19:22:59.810566 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810570 | orchestrator | ok: [testbed-node-4] 2025-09-23 
19:22:59.810574 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810578 | orchestrator | 2025-09-23 19:22:59.810582 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-23 19:22:59.810586 | orchestrator | Tuesday 23 September 2025 19:21:51 +0000 (0:00:00.671) 0:09:37.052 ***** 2025-09-23 19:22:59.810591 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810595 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810599 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810603 | orchestrator | 2025-09-23 19:22:59.810607 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-23 19:22:59.810611 | orchestrator | Tuesday 23 September 2025 19:21:52 +0000 (0:00:00.705) 0:09:37.758 ***** 2025-09-23 19:22:59.810615 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810619 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810623 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810627 | orchestrator | 2025-09-23 19:22:59.810632 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-23 19:22:59.810636 | orchestrator | Tuesday 23 September 2025 19:21:52 +0000 (0:00:00.696) 0:09:38.454 ***** 2025-09-23 19:22:59.810643 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810647 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810651 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.810655 | orchestrator | 2025-09-23 19:22:59.810664 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-23 19:22:59.810669 | orchestrator | Tuesday 23 September 2025 19:21:53 +0000 (0:00:00.534) 0:09:38.989 ***** 2025-09-23 19:22:59.810673 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810677 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810681 | orchestrator | skipping: 
[testbed-node-5] 2025-09-23 19:22:59.810685 | orchestrator | 2025-09-23 19:22:59.810689 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-23 19:22:59.810693 | orchestrator | Tuesday 23 September 2025 19:21:53 +0000 (0:00:00.305) 0:09:39.295 ***** 2025-09-23 19:22:59.810697 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810702 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810706 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.810710 | orchestrator | 2025-09-23 19:22:59.810714 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-23 19:22:59.810718 | orchestrator | Tuesday 23 September 2025 19:21:53 +0000 (0:00:00.283) 0:09:39.578 ***** 2025-09-23 19:22:59.810722 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810726 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810730 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810734 | orchestrator | 2025-09-23 19:22:59.810739 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-23 19:22:59.810743 | orchestrator | Tuesday 23 September 2025 19:21:54 +0000 (0:00:00.720) 0:09:40.298 ***** 2025-09-23 19:22:59.810747 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810751 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810755 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810759 | orchestrator | 2025-09-23 19:22:59.810763 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-23 19:22:59.810767 | orchestrator | Tuesday 23 September 2025 19:21:55 +0000 (0:00:00.957) 0:09:41.256 ***** 2025-09-23 19:22:59.810771 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810775 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810780 | orchestrator | skipping: [testbed-node-5] 2025-09-23 
19:22:59.810784 | orchestrator | 2025-09-23 19:22:59.810788 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-23 19:22:59.810792 | orchestrator | Tuesday 23 September 2025 19:21:55 +0000 (0:00:00.307) 0:09:41.564 ***** 2025-09-23 19:22:59.810796 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810800 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810804 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.810808 | orchestrator | 2025-09-23 19:22:59.810812 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-23 19:22:59.810817 | orchestrator | Tuesday 23 September 2025 19:21:56 +0000 (0:00:00.300) 0:09:41.864 ***** 2025-09-23 19:22:59.810821 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810825 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810829 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810833 | orchestrator | 2025-09-23 19:22:59.810837 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-23 19:22:59.810841 | orchestrator | Tuesday 23 September 2025 19:21:56 +0000 (0:00:00.330) 0:09:42.195 ***** 2025-09-23 19:22:59.810845 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810850 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810854 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810858 | orchestrator | 2025-09-23 19:22:59.810862 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-23 19:22:59.810866 | orchestrator | Tuesday 23 September 2025 19:21:57 +0000 (0:00:00.543) 0:09:42.738 ***** 2025-09-23 19:22:59.810870 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810877 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810881 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810886 | orchestrator | 2025-09-23 
19:22:59.810890 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-23 19:22:59.810894 | orchestrator | Tuesday 23 September 2025 19:21:57 +0000 (0:00:00.320) 0:09:43.059 ***** 2025-09-23 19:22:59.810898 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810902 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810907 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.810911 | orchestrator | 2025-09-23 19:22:59.810915 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-23 19:22:59.810919 | orchestrator | Tuesday 23 September 2025 19:21:57 +0000 (0:00:00.282) 0:09:43.341 ***** 2025-09-23 19:22:59.810923 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810927 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810931 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.810935 | orchestrator | 2025-09-23 19:22:59.810939 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-23 19:22:59.810944 | orchestrator | Tuesday 23 September 2025 19:21:58 +0000 (0:00:00.325) 0:09:43.667 ***** 2025-09-23 19:22:59.810948 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.810952 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.810956 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.810960 | orchestrator | 2025-09-23 19:22:59.810964 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-23 19:22:59.810968 | orchestrator | Tuesday 23 September 2025 19:21:58 +0000 (0:00:00.528) 0:09:44.196 ***** 2025-09-23 19:22:59.810972 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.810976 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.810980 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.810985 | orchestrator | 2025-09-23 19:22:59.810989 | 
orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-23 19:22:59.810993 | orchestrator | Tuesday 23 September 2025 19:21:58 +0000 (0:00:00.318) 0:09:44.514 ***** 2025-09-23 19:22:59.810997 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.811001 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.811005 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.811009 | orchestrator | 2025-09-23 19:22:59.811013 | orchestrator | TASK [ceph-rgw : Include common.yml] ******************************************* 2025-09-23 19:22:59.811017 | orchestrator | Tuesday 23 September 2025 19:21:59 +0000 (0:00:00.545) 0:09:45.060 ***** 2025-09-23 19:22:59.811022 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.811026 | orchestrator | 2025-09-23 19:22:59.811030 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2025-09-23 19:22:59.811036 | orchestrator | Tuesday 23 September 2025 19:22:00 +0000 (0:00:00.747) 0:09:45.807 ***** 2025-09-23 19:22:59.811041 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.811045 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-23 19:22:59.811050 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-23 19:22:59.811054 | orchestrator | 2025-09-23 19:22:59.811058 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2025-09-23 19:22:59.811062 | orchestrator | Tuesday 23 September 2025 19:22:02 +0000 (0:00:02.250) 0:09:48.058 ***** 2025-09-23 19:22:59.811066 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-09-23 19:22:59.811070 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-23 19:22:59.811075 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.811079 | orchestrator 
| changed: [testbed-node-4] => (item=None) 2025-09-23 19:22:59.811094 | orchestrator | skipping: [testbed-node-4] => (item=None)  2025-09-23 19:22:59.811099 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.811103 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-09-23 19:22:59.811107 | orchestrator | skipping: [testbed-node-5] => (item=None)  2025-09-23 19:22:59.811115 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.811119 | orchestrator | 2025-09-23 19:22:59.811123 | orchestrator | TASK [ceph-rgw : Copy SSL certificate & key data to certificate path] ********** 2025-09-23 19:22:59.811127 | orchestrator | Tuesday 23 September 2025 19:22:03 +0000 (0:00:01.236) 0:09:49.294 ***** 2025-09-23 19:22:59.811131 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.811135 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.811139 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.811143 | orchestrator | 2025-09-23 19:22:59.811147 | orchestrator | TASK [ceph-rgw : Include_tasks pre_requisite.yml] ****************************** 2025-09-23 19:22:59.811152 | orchestrator | Tuesday 23 September 2025 19:22:04 +0000 (0:00:00.335) 0:09:49.630 ***** 2025-09-23 19:22:59.811156 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/pre_requisite.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.811160 | orchestrator | 2025-09-23 19:22:59.811164 | orchestrator | TASK [ceph-rgw : Create rados gateway directories] ***************************** 2025-09-23 19:22:59.811168 | orchestrator | Tuesday 23 September 2025 19:22:04 +0000 (0:00:00.768) 0:09:50.398 ***** 2025-09-23 19:22:59.811172 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2025-09-23 19:22:59.811177 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => 
(item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2025-09-23 19:22:59.811181 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2025-09-23 19:22:59.811185 | orchestrator | 2025-09-23 19:22:59.811189 | orchestrator | TASK [ceph-rgw : Create rgw keyrings] ****************************************** 2025-09-23 19:22:59.811193 | orchestrator | Tuesday 23 September 2025 19:22:05 +0000 (0:00:00.809) 0:09:51.208 ***** 2025-09-23 19:22:59.811197 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.811201 | orchestrator | changed: [testbed-node-4 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2025-09-23 19:22:59.811206 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.811252 | orchestrator | changed: [testbed-node-3 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2025-09-23 19:22:59.811260 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.811265 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2025-09-23 19:22:59.811269 | orchestrator | 2025-09-23 19:22:59.811273 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2025-09-23 19:22:59.811277 | orchestrator | Tuesday 23 September 2025 19:22:09 +0000 (0:00:04.302) 0:09:55.510 ***** 2025-09-23 19:22:59.811281 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.811285 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-23 19:22:59.811289 | orchestrator | 
ok: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.811293 | orchestrator | ok: [testbed-node-4 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-23 19:22:59.811297 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:22:59.811302 | orchestrator | ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-23 19:22:59.811306 | orchestrator | 2025-09-23 19:22:59.811310 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2025-09-23 19:22:59.811314 | orchestrator | Tuesday 23 September 2025 19:22:12 +0000 (0:00:02.978) 0:09:58.489 ***** 2025-09-23 19:22:59.811318 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-09-23 19:22:59.811326 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.811330 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-09-23 19:22:59.811334 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.811338 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-09-23 19:22:59.811342 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.811347 | orchestrator | 2025-09-23 19:22:59.811351 | orchestrator | TASK [ceph-rgw : Rgw pool creation tasks] ************************************** 2025-09-23 19:22:59.811360 | orchestrator | Tuesday 23 September 2025 19:22:14 +0000 (0:00:01.289) 0:09:59.779 ***** 2025-09-23 19:22:59.811365 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/rgw_create_pools.yml for testbed-node-3 2025-09-23 19:22:59.811369 | orchestrator | 2025-09-23 19:22:59.811373 | orchestrator | TASK [ceph-rgw : Create ec profile] ******************************************** 2025-09-23 19:22:59.811377 | orchestrator | Tuesday 23 September 2025 19:22:14 +0000 (0:00:00.232) 0:10:00.012 ***** 2025-09-23 19:22:59.811381 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 
'replicated'}})  2025-09-23 19:22:59.811386 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811390 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811394 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811398 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811402 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.811406 | orchestrator | 2025-09-23 19:22:59.811410 | orchestrator | TASK [ceph-rgw : Set crush rule] *********************************************** 2025-09-23 19:22:59.811414 | orchestrator | Tuesday 23 September 2025 19:22:14 +0000 (0:00:00.575) 0:10:00.588 ***** 2025-09-23 19:22:59.811418 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811423 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811427 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811431 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811435 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-23 19:22:59.811439 | orchestrator | skipping: [testbed-node-3] 2025-09-23 
19:22:59.811443 | orchestrator | 2025-09-23 19:22:59.811447 | orchestrator | TASK [ceph-rgw : Create rgw pools] ********************************************* 2025-09-23 19:22:59.811451 | orchestrator | Tuesday 23 September 2025 19:22:15 +0000 (0:00:00.572) 0:10:01.160 ***** 2025-09-23 19:22:59.811456 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-23 19:22:59.811460 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-23 19:22:59.811464 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-23 19:22:59.811468 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-23 19:22:59.811476 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-23 19:22:59.811480 | orchestrator | 2025-09-23 19:22:59.811484 | orchestrator | TASK [ceph-rgw : Include_tasks openstack-keystone.yml] ************************* 2025-09-23 19:22:59.811488 | orchestrator | Tuesday 23 September 2025 19:22:46 +0000 (0:00:30.486) 0:10:31.647 ***** 2025-09-23 19:22:59.811492 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.811496 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.811500 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.811504 | orchestrator | 2025-09-23 19:22:59.811508 | orchestrator | TASK [ceph-rgw : Include_tasks start_radosgw.yml] ****************************** 2025-09-23 19:22:59.811512 | orchestrator | 
Tuesday 23 September 2025 19:22:46 +0000 (0:00:00.288) 0:10:31.935 ***** 2025-09-23 19:22:59.811517 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.811521 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.811525 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.811529 | orchestrator | 2025-09-23 19:22:59.811533 | orchestrator | TASK [ceph-rgw : Include start_docker_rgw.yml] ********************************* 2025-09-23 19:22:59.811537 | orchestrator | Tuesday 23 September 2025 19:22:46 +0000 (0:00:00.550) 0:10:32.485 ***** 2025-09-23 19:22:59.811541 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/start_docker_rgw.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.811545 | orchestrator | 2025-09-23 19:22:59.811549 | orchestrator | TASK [ceph-rgw : Include_task systemd.yml] ************************************* 2025-09-23 19:22:59.811553 | orchestrator | Tuesday 23 September 2025 19:22:47 +0000 (0:00:00.526) 0:10:33.011 ***** 2025-09-23 19:22:59.811558 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.811562 | orchestrator | 2025-09-23 19:22:59.811568 | orchestrator | TASK [ceph-rgw : Generate systemd unit file] *********************************** 2025-09-23 19:22:59.811575 | orchestrator | Tuesday 23 September 2025 19:22:48 +0000 (0:00:00.684) 0:10:33.696 ***** 2025-09-23 19:22:59.811579 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.811583 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.811587 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.811591 | orchestrator | 2025-09-23 19:22:59.811595 | orchestrator | TASK [ceph-rgw : Generate systemd ceph-radosgw target file] ******************** 2025-09-23 19:22:59.811600 | orchestrator | Tuesday 23 September 2025 19:22:49 +0000 (0:00:01.345) 0:10:35.041 ***** 2025-09-23 19:22:59.811604 | orchestrator | changed: 
[testbed-node-3] 2025-09-23 19:22:59.811608 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.811612 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.811616 | orchestrator | 2025-09-23 19:22:59.811620 | orchestrator | TASK [ceph-rgw : Enable ceph-radosgw.target] *********************************** 2025-09-23 19:22:59.811624 | orchestrator | Tuesday 23 September 2025 19:22:50 +0000 (0:00:01.213) 0:10:36.254 ***** 2025-09-23 19:22:59.811628 | orchestrator | changed: [testbed-node-4] 2025-09-23 19:22:59.811632 | orchestrator | changed: [testbed-node-3] 2025-09-23 19:22:59.811636 | orchestrator | changed: [testbed-node-5] 2025-09-23 19:22:59.811640 | orchestrator | 2025-09-23 19:22:59.811644 | orchestrator | TASK [ceph-rgw : Systemd start rgw container] ********************************** 2025-09-23 19:22:59.811648 | orchestrator | Tuesday 23 September 2025 19:22:52 +0000 (0:00:01.648) 0:10:37.903 ***** 2025-09-23 19:22:59.811652 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2025-09-23 19:22:59.811657 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2025-09-23 19:22:59.811661 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2025-09-23 19:22:59.811665 | orchestrator | 2025-09-23 19:22:59.811669 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-23 19:22:59.811676 | orchestrator | Tuesday 23 September 2025 19:22:54 +0000 (0:00:02.579) 0:10:40.482 ***** 2025-09-23 19:22:59.811680 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.811685 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.811689 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.811693 | orchestrator 
| 2025-09-23 19:22:59.811697 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2025-09-23 19:22:59.811701 | orchestrator | Tuesday 23 September 2025 19:22:55 +0000 (0:00:00.324) 0:10:40.807 ***** 2025-09-23 19:22:59.811705 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:22:59.811709 | orchestrator | 2025-09-23 19:22:59.811713 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2025-09-23 19:22:59.811717 | orchestrator | Tuesday 23 September 2025 19:22:55 +0000 (0:00:00.793) 0:10:41.600 ***** 2025-09-23 19:22:59.811721 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.811725 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.811729 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.811733 | orchestrator | 2025-09-23 19:22:59.811738 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2025-09-23 19:22:59.811742 | orchestrator | Tuesday 23 September 2025 19:22:56 +0000 (0:00:00.315) 0:10:41.916 ***** 2025-09-23 19:22:59.811746 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:22:59.811750 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:22:59.811754 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:22:59.811758 | orchestrator | 2025-09-23 19:22:59.811762 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ******************** 2025-09-23 19:22:59.811766 | orchestrator | Tuesday 23 September 2025 19:22:56 +0000 (0:00:00.322) 0:10:42.238 ***** 2025-09-23 19:22:59.811770 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:22:59.811774 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:22:59.811778 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:22:59.811782 | orchestrator 
| skipping: [testbed-node-3] 2025-09-23 19:22:59.811786 | orchestrator | 2025-09-23 19:22:59.811790 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2025-09-23 19:22:59.811794 | orchestrator | Tuesday 23 September 2025 19:22:57 +0000 (0:00:01.060) 0:10:43.299 ***** 2025-09-23 19:22:59.811799 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:22:59.811803 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:22:59.811807 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:22:59.811811 | orchestrator | 2025-09-23 19:22:59.811815 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:22:59.811819 | orchestrator | testbed-node-0 : ok=134  changed=35  unreachable=0 failed=0 skipped=125  rescued=0 ignored=0 2025-09-23 19:22:59.811823 | orchestrator | testbed-node-1 : ok=127  changed=31  unreachable=0 failed=0 skipped=120  rescued=0 ignored=0 2025-09-23 19:22:59.811827 | orchestrator | testbed-node-2 : ok=134  changed=33  unreachable=0 failed=0 skipped=119  rescued=0 ignored=0 2025-09-23 19:22:59.811832 | orchestrator | testbed-node-3 : ok=193  changed=45  unreachable=0 failed=0 skipped=162  rescued=0 ignored=0 2025-09-23 19:22:59.811836 | orchestrator | testbed-node-4 : ok=175  changed=40  unreachable=0 failed=0 skipped=123  rescued=0 ignored=0 2025-09-23 19:22:59.811845 | orchestrator | testbed-node-5 : ok=177  changed=41  unreachable=0 failed=0 skipped=121  rescued=0 ignored=0 2025-09-23 19:22:59.811849 | orchestrator | 2025-09-23 19:22:59.811853 | orchestrator | 2025-09-23 19:22:59.811858 | orchestrator | 2025-09-23 19:22:59.811862 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:22:59.811870 | orchestrator | Tuesday 23 September 2025 19:22:57 +0000 (0:00:00.276) 0:10:43.575 ***** 2025-09-23 19:22:59.811874 | orchestrator | =============================================================================== 
2025-09-23 19:22:59.811878 | orchestrator | ceph-osd : Use ceph-volume to create osds ------------------------------ 41.56s
2025-09-23 19:22:59.811882 | orchestrator | ceph-container-common : Pulling Ceph container image ------------------- 39.99s
2025-09-23 19:22:59.811886 | orchestrator | ceph-mgr : Wait for all mgr to be up ----------------------------------- 36.90s
2025-09-23 19:22:59.811890 | orchestrator | ceph-rgw : Create rgw pools -------------------------------------------- 30.49s
2025-09-23 19:22:59.811894 | orchestrator | ceph-mon : Waiting for the monitor(s) to form the quorum... ------------ 22.01s
2025-09-23 19:22:59.811898 | orchestrator | ceph-mon : Set cluster configs ----------------------------------------- 14.90s
2025-09-23 19:22:59.811902 | orchestrator | ceph-osd : Wait for all osd to be up ----------------------------------- 12.67s
2025-09-23 19:22:59.811906 | orchestrator | ceph-mgr : Create ceph mgr keyring(s) on a mon node -------------------- 10.75s
2025-09-23 19:22:59.811910 | orchestrator | ceph-mon : Fetch ceph initial keys -------------------------------------- 9.07s
2025-09-23 19:22:59.811914 | orchestrator | ceph-mds : Create filesystem pools -------------------------------------- 8.24s
2025-09-23 19:22:59.811918 | orchestrator | ceph-mgr : Disable ceph mgr enabled modules ----------------------------- 6.45s
2025-09-23 19:22:59.811922 | orchestrator | ceph-config : Create ceph initial directories --------------------------- 6.35s
2025-09-23 19:22:59.811926 | orchestrator | ceph-mgr : Add modules to ceph-mgr -------------------------------------- 4.71s
2025-09-23 19:22:59.811930 | orchestrator | ceph-rgw : Create rgw keyrings ------------------------------------------ 4.30s
2025-09-23 19:22:59.811935 | orchestrator | ceph-crash : Create client.crash keyring -------------------------------- 3.91s
2025-09-23 19:22:59.811939 | orchestrator | ceph-facts : Set_fact _monitor_addresses - ipv4 ------------------------- 3.76s
2025-09-23 19:22:59.811943 | orchestrator | ceph-mds : Create ceph filesystem --------------------------------------- 3.72s
2025-09-23 19:22:59.811947 | orchestrator | ceph-config : Run 'ceph-volume lvm list' to see how many osds have already been created --- 3.71s
2025-09-23 19:22:59.811951 | orchestrator | ceph-mon : Copy admin keyring over to mons ------------------------------ 3.66s
2025-09-23 19:22:59.811955 | orchestrator | ceph-crash : Start the ceph-crash service ------------------------------- 3.50s
2025-09-23 19:23:02.841997 | orchestrator | 2025-09-23 19:23:02 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:02.843390 | orchestrator | 2025-09-23 19:23:02 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:02.845139 | orchestrator | 2025-09-23 19:23:02 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:02.845387 | orchestrator | 2025-09-23 19:23:02 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:05.885751 | orchestrator | 2025-09-23 19:23:05 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:05.887660 | orchestrator | 2025-09-23 19:23:05 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:05.889854 | orchestrator | 2025-09-23 19:23:05 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:05.889906 | orchestrator | 2025-09-23 19:23:05 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:08.931607 | orchestrator | 2025-09-23 19:23:08 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:08.931843 | orchestrator | 2025-09-23 19:23:08 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:08.933042 | orchestrator | 2025-09-23 19:23:08 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:08.933063 | orchestrator |
2025-09-23 19:23:08 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:11.987584 | orchestrator | 2025-09-23 19:23:11 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:11.987863 | orchestrator | 2025-09-23 19:23:11 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:11.988586 | orchestrator | 2025-09-23 19:23:11 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:11.988622 | orchestrator | 2025-09-23 19:23:11 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:15.026802 | orchestrator | 2025-09-23 19:23:15 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:15.026890 | orchestrator | 2025-09-23 19:23:15 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:15.028359 | orchestrator | 2025-09-23 19:23:15 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:15.028423 | orchestrator | 2025-09-23 19:23:15 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:18.062304 | orchestrator | 2025-09-23 19:23:18 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:18.063065 | orchestrator | 2025-09-23 19:23:18 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:18.065111 | orchestrator | 2025-09-23 19:23:18 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:18.065149 | orchestrator | 2025-09-23 19:23:18 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:21.164205 | orchestrator | 2025-09-23 19:23:21 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:21.165240 | orchestrator | 2025-09-23 19:23:21 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:21.166137 | orchestrator | 2025-09-23 19:23:21 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:21.166231 | orchestrator | 2025-09-23 19:23:21 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:24.204210 | orchestrator | 2025-09-23 19:23:24 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:24.204309 | orchestrator | 2025-09-23 19:23:24 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:24.206708 | orchestrator | 2025-09-23 19:23:24 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:24.206739 | orchestrator | 2025-09-23 19:23:24 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:27.257190 | orchestrator | 2025-09-23 19:23:27 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:27.259008 | orchestrator | 2025-09-23 19:23:27 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:27.259931 | orchestrator | 2025-09-23 19:23:27 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:27.259964 | orchestrator | 2025-09-23 19:23:27 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:30.294937 | orchestrator | 2025-09-23 19:23:30 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:30.295598 | orchestrator | 2025-09-23 19:23:30 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state STARTED
2025-09-23 19:23:30.302334 | orchestrator | 2025-09-23 19:23:30 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED
2025-09-23 19:23:30.302429 | orchestrator | 2025-09-23 19:23:30 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:23:33.333679 | orchestrator | 2025-09-23 19:23:33 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:23:33.338534 | orchestrator |
2025-09-23 19:23:33.338598 | orchestrator |
2025-09-23 19:23:33.338608 | orchestrator |
PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:23:33.338617 | orchestrator |
2025-09-23 19:23:33.338624 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:23:33.338632 | orchestrator | Tuesday 23 September 2025 19:23:00 +0000 (0:00:00.278) 0:00:00.278 *****
2025-09-23 19:23:33.338639 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:23:33.338647 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:23:33.338655 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:23:33.338662 | orchestrator |
2025-09-23 19:23:33.338669 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:23:33.338676 | orchestrator | Tuesday 23 September 2025 19:23:00 +0000 (0:00:00.339) 0:00:00.617 *****
2025-09-23 19:23:33.338683 | orchestrator | ok: [testbed-node-0] => (item=enable_horizon_True)
2025-09-23 19:23:33.338691 | orchestrator | ok: [testbed-node-1] => (item=enable_horizon_True)
2025-09-23 19:23:33.338698 | orchestrator | ok: [testbed-node-2] => (item=enable_horizon_True)
2025-09-23 19:23:33.338705 | orchestrator |
2025-09-23 19:23:33.338711 | orchestrator | PLAY [Apply role horizon] ******************************************************
2025-09-23 19:23:33.338718 | orchestrator |
2025-09-23 19:23:33.338725 | orchestrator | TASK [horizon : include_tasks] *************************************************
2025-09-23 19:23:33.338743 | orchestrator | Tuesday 23 September 2025 19:23:01 +0000 (0:00:00.538) 0:00:01.155 *****
2025-09-23 19:23:33.338750 | orchestrator | included: /ansible/roles/horizon/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:23:33.338758 | orchestrator |
2025-09-23 19:23:33.338765 | orchestrator | TASK [horizon : Ensuring config directories exist] *****************************
2025-09-23 19:23:33.338772 | orchestrator | Tuesday 23 September 2025 19:23:01 +0000
(0:00:00.516) 0:00:01.672 *****
2025-09-23 19:23:33.338798 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.338847 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.338860 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.338874 | orchestrator |
2025-09-23 19:23:33.338882 | orchestrator | TASK [horizon : Set empty custom policy] ***************************************
2025-09-23 19:23:33.338889 | orchestrator | Tuesday 23 September 2025 19:23:03 +0000 (0:00:01.370) 0:00:03.043 *****
2025-09-23 19:23:33.339115 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:23:33.339127 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:23:33.339133 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:23:33.339140 | orchestrator |
2025-09-23 19:23:33.339147 | orchestrator | TASK [horizon : include_tasks] *************************************************
2025-09-23 19:23:33.339153 | orchestrator | Tuesday 23 September 2025 19:23:03 +0000 (0:00:00.431) 0:00:03.474 *****
2025-09-23 19:23:33.339160 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'cloudkitty', 'enabled': False})
2025-09-23 19:23:33.339173 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'heat', 'enabled': 'no'})
2025-09-23 19:23:33.339180 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'ironic', 'enabled': False})
2025-09-23 19:23:33.339187 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'masakari', 'enabled': False})
2025-09-23 19:23:33.339193 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'mistral', 'enabled': False})
2025-09-23 19:23:33.339200 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'tacker', 'enabled': False})
2025-09-23 19:23:33.339206 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'trove', 'enabled': False})
2025-09-23 19:23:33.339213 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'watcher', 'enabled': False})
2025-09-23 19:23:33.339219 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'cloudkitty', 'enabled': False})
2025-09-23 19:23:33.339225 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'heat', 'enabled': 'no'})
2025-09-23 19:23:33.339232 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'ironic', 'enabled': False})
2025-09-23 19:23:33.339238 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'masakari', 'enabled': False})
2025-09-23 19:23:33.339245 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'mistral', 'enabled': False})
2025-09-23 19:23:33.339252 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'tacker', 'enabled': False})
2025-09-23 19:23:33.339258 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'trove', 'enabled': False})
2025-09-23 19:23:33.339264 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'watcher', 'enabled': False})
2025-09-23 19:23:33.339271 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'cloudkitty', 'enabled': False})
2025-09-23 19:23:33.339277 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'heat', 'enabled': 'no'})
2025-09-23 19:23:33.339284 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'ironic', 'enabled': False})
2025-09-23 19:23:33.339295 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'masakari', 'enabled': False})
2025-09-23 19:23:33.339302 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'mistral', 'enabled': False})
2025-09-23 19:23:33.339308 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'tacker', 'enabled': False})
2025-09-23 19:23:33.339315 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'trove', 'enabled': False})
2025-09-23 19:23:33.339321 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'watcher', 'enabled': False})
2025-09-23 19:23:33.339329 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'ceilometer', 'enabled': 'yes'})
2025-09-23 19:23:33.339344 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'cinder', 'enabled': 'yes'})
2025-09-23 19:23:33.339351 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'designate', 'enabled': True})
2025-09-23 19:23:33.339358 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'glance', 'enabled': True})
2025-09-23 19:23:33.339364 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'keystone', 'enabled': True})
2025-09-23 19:23:33.339371 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'magnum', 'enabled': True})
2025-09-23 19:23:33.339377 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'manila', 'enabled': True})
2025-09-23 19:23:33.339384 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'neutron', 'enabled': True})
2025-09-23 19:23:33.339390 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'nova', 'enabled': True})
2025-09-23 19:23:33.339398 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'octavia', 'enabled': True})
2025-09-23 19:23:33.339404 | orchestrator |
2025-09-23 19:23:33.339411 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2025-09-23 19:23:33.339418 | orchestrator | Tuesday 23 September 2025 19:23:04 +0000 (0:00:00.742) 0:00:04.216 *****
2025-09-23 19:23:33.339424 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:23:33.339431 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:23:33.339437 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:23:33.339444 | orchestrator |
2025-09-23 19:23:33.339450 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************
2025-09-23 19:23:33.339457 | orchestrator | Tuesday 23 September 2025 19:23:04 +0000 (0:00:00.308) 0:00:04.525 *****
2025-09-23 19:23:33.339463 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:23:33.339470 | orchestrator |
2025-09-23 19:23:33.339480 | orchestrator | TASK [horizon : Update custom policy file name] ********************************
2025-09-23 19:23:33.339487 | orchestrator | Tuesday 23 September 2025 19:23:04 +0000 (0:00:00.123) 0:00:04.648 *****
2025-09-23 19:23:33.339493 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:23:33.339500 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:23:33.339506 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:23:33.339513 | orchestrator |
2025-09-23 19:23:33.339519 | orchestrator | TASK [horizon : Update policy file name] ***************************************
2025-09-23 19:23:33.339526 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.445) 0:00:05.094 *****
2025-09-23 19:23:33.339532 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:23:33.339539 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:23:33.339546 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.339552 | orchestrator | 2025-09-23 19:23:33.339559 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.339565 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.314) 0:00:05.409 ***** 2025-09-23 19:23:33.339572 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.339578 | orchestrator | 2025-09-23 19:23:33.339585 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.339591 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.117) 0:00:05.527 ***** 2025-09-23 19:23:33.339598 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.339604 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.339611 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.339621 | orchestrator | 2025-09-23 19:23:33.339628 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-23 19:23:33.339635 | orchestrator | Tuesday 23 September 2025 19:23:06 +0000 (0:00:00.289) 0:00:05.816 ***** 2025-09-23 19:23:33.339641 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:33.339648 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:33.339654 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.339661 | orchestrator | 2025-09-23 19:23:33.339667 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.339674 | orchestrator | Tuesday 23 September 2025 19:23:06 +0000 (0:00:00.333) 0:00:06.149 ***** 2025-09-23 19:23:33.339681 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.339687 | orchestrator | 2025-09-23 19:23:33.339694 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.339703 | orchestrator | Tuesday 23 September 2025 
19:23:06 +0000 (0:00:00.125) 0:00:06.274 ***** 2025-09-23 19:23:33.339711 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.339718 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.339725 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.339733 | orchestrator | 2025-09-23 19:23:33.339740 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-23 19:23:33.339747 | orchestrator | Tuesday 23 September 2025 19:23:06 +0000 (0:00:00.501) 0:00:06.776 ***** 2025-09-23 19:23:33.339755 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:33.339762 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:33.339770 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.339777 | orchestrator | 2025-09-23 19:23:33.339784 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.339791 | orchestrator | Tuesday 23 September 2025 19:23:07 +0000 (0:00:00.286) 0:00:07.063 ***** 2025-09-23 19:23:33.339799 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.339806 | orchestrator | 2025-09-23 19:23:33.339813 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.339820 | orchestrator | Tuesday 23 September 2025 19:23:07 +0000 (0:00:00.125) 0:00:07.188 ***** 2025-09-23 19:23:33.339828 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.339835 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.339842 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.339849 | orchestrator | 2025-09-23 19:23:33.339856 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-23 19:23:33.339864 | orchestrator | Tuesday 23 September 2025 19:23:07 +0000 (0:00:00.299) 0:00:07.487 ***** 2025-09-23 19:23:33.339871 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:33.339878 | orchestrator | ok: 
[testbed-node-1] 2025-09-23 19:23:33.339884 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.339891 | orchestrator | 2025-09-23 19:23:33.339898 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.339904 | orchestrator | Tuesday 23 September 2025 19:23:07 +0000 (0:00:00.296) 0:00:07.784 ***** 2025-09-23 19:23:33.339911 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.339917 | orchestrator | 2025-09-23 19:23:33.339924 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.339930 | orchestrator | Tuesday 23 September 2025 19:23:08 +0000 (0:00:00.308) 0:00:08.093 ***** 2025-09-23 19:23:33.339937 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.339943 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.339950 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.339956 | orchestrator | 2025-09-23 19:23:33.339963 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-23 19:23:33.339969 | orchestrator | Tuesday 23 September 2025 19:23:08 +0000 (0:00:00.284) 0:00:08.378 ***** 2025-09-23 19:23:33.339976 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:33.339982 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:33.339989 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.339995 | orchestrator | 2025-09-23 19:23:33.340007 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.340013 | orchestrator | Tuesday 23 September 2025 19:23:08 +0000 (0:00:00.315) 0:00:08.694 ***** 2025-09-23 19:23:33.340020 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340027 | orchestrator | 2025-09-23 19:23:33.340033 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.340040 | orchestrator | Tuesday 23 
September 2025 19:23:09 +0000 (0:00:00.121) 0:00:08.815 ***** 2025-09-23 19:23:33.340046 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340053 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.340059 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.340083 | orchestrator | 2025-09-23 19:23:33.340090 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-23 19:23:33.340100 | orchestrator | Tuesday 23 September 2025 19:23:09 +0000 (0:00:00.296) 0:00:09.111 ***** 2025-09-23 19:23:33.340107 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:33.340114 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:33.340120 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.340127 | orchestrator | 2025-09-23 19:23:33.340133 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.340140 | orchestrator | Tuesday 23 September 2025 19:23:09 +0000 (0:00:00.546) 0:00:09.658 ***** 2025-09-23 19:23:33.340146 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340153 | orchestrator | 2025-09-23 19:23:33.340160 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.340166 | orchestrator | Tuesday 23 September 2025 19:23:10 +0000 (0:00:00.194) 0:00:09.853 ***** 2025-09-23 19:23:33.340173 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340179 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.340186 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.340192 | orchestrator | 2025-09-23 19:23:33.340199 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-23 19:23:33.340205 | orchestrator | Tuesday 23 September 2025 19:23:10 +0000 (0:00:00.483) 0:00:10.336 ***** 2025-09-23 19:23:33.340212 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:33.340218 | 
orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:33.340225 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.340231 | orchestrator | 2025-09-23 19:23:33.340238 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.340244 | orchestrator | Tuesday 23 September 2025 19:23:10 +0000 (0:00:00.399) 0:00:10.736 ***** 2025-09-23 19:23:33.340251 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340257 | orchestrator | 2025-09-23 19:23:33.340264 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.340270 | orchestrator | Tuesday 23 September 2025 19:23:11 +0000 (0:00:00.139) 0:00:10.876 ***** 2025-09-23 19:23:33.340277 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340283 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.340290 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.340296 | orchestrator | 2025-09-23 19:23:33.340303 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-23 19:23:33.340309 | orchestrator | Tuesday 23 September 2025 19:23:11 +0000 (0:00:00.305) 0:00:11.181 ***** 2025-09-23 19:23:33.340316 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:33.340322 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:33.340329 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.340335 | orchestrator | 2025-09-23 19:23:33.340345 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.340352 | orchestrator | Tuesday 23 September 2025 19:23:12 +0000 (0:00:00.666) 0:00:11.848 ***** 2025-09-23 19:23:33.340359 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340365 | orchestrator | 2025-09-23 19:23:33.340372 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.340379 | 
orchestrator | Tuesday 23 September 2025 19:23:12 +0000 (0:00:00.187) 0:00:12.035 ***** 2025-09-23 19:23:33.340390 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340396 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.340403 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.340409 | orchestrator | 2025-09-23 19:23:33.340416 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-23 19:23:33.340422 | orchestrator | Tuesday 23 September 2025 19:23:12 +0000 (0:00:00.293) 0:00:12.329 ***** 2025-09-23 19:23:33.340429 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:33.340436 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:33.340442 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:33.340448 | orchestrator | 2025-09-23 19:23:33.340455 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-23 19:23:33.340462 | orchestrator | Tuesday 23 September 2025 19:23:12 +0000 (0:00:00.309) 0:00:12.638 ***** 2025-09-23 19:23:33.340468 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340475 | orchestrator | 2025-09-23 19:23:33.340481 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-23 19:23:33.340487 | orchestrator | Tuesday 23 September 2025 19:23:12 +0000 (0:00:00.140) 0:00:12.778 ***** 2025-09-23 19:23:33.340494 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.340500 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.340507 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.340513 | orchestrator | 2025-09-23 19:23:33.340520 | orchestrator | TASK [horizon : Copying over config.json files for services] ******************* 2025-09-23 19:23:33.340526 | orchestrator | Tuesday 23 September 2025 19:23:13 +0000 (0:00:00.491) 0:00:13.270 ***** 2025-09-23 19:23:33.340533 | orchestrator | changed: [testbed-node-1] 
2025-09-23 19:23:33.340539 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:23:33.340546 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:23:33.340552 | orchestrator |
2025-09-23 19:23:33.340559 | orchestrator | TASK [horizon : Copying over horizon.conf] *************************************
2025-09-23 19:23:33.340565 | orchestrator | Tuesday 23 September 2025 19:23:15 +0000 (0:00:01.615) 0:00:14.886 *****
2025-09-23 19:23:33.340572 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2025-09-23 19:23:33.340578 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2025-09-23 19:23:33.340585 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/horizon.conf.j2)
2025-09-23 19:23:33.340591 | orchestrator |
2025-09-23 19:23:33.340598 | orchestrator | TASK [horizon : Copying over kolla-settings.py] ********************************
2025-09-23 19:23:33.340604 | orchestrator | Tuesday 23 September 2025 19:23:17 +0000 (0:00:02.253) 0:00:17.139 *****
2025-09-23 19:23:33.340611 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2025-09-23 19:23:33.340617 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2025-09-23 19:23:33.340624 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2)
2025-09-23 19:23:33.340630 | orchestrator |
2025-09-23 19:23:33.340637 | orchestrator | TASK [horizon : Copying over custom-settings.py] *******************************
2025-09-23 19:23:33.340646 | orchestrator | Tuesday 23 September 2025 19:23:20 +0000 (0:00:02.889) 0:00:20.029 *****
2025-09-23 19:23:33.340653 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2025-09-23 19:23:33.340660 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2025-09-23 19:23:33.340666 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2)
2025-09-23 19:23:33.340673 | orchestrator |
2025-09-23 19:23:33.340679 | orchestrator | TASK [horizon : Copying over existing policy file] *****************************
2025-09-23 19:23:33.340686 | orchestrator | Tuesday 23 September 2025 19:23:22 +0000 (0:00:01.959) 0:00:21.988 *****
2025-09-23 19:23:33.340700 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:23:33.340707 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:23:33.340713 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:23:33.340720 | orchestrator |
2025-09-23 19:23:33.340726 | orchestrator | TASK [horizon : Copying over custom themes] ************************************
2025-09-23 19:23:33.340733 | orchestrator | Tuesday 23 September 2025 19:23:22 +0000 (0:00:00.306) 0:00:22.294 *****
2025-09-23 19:23:33.340739 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:23:33.340746 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:23:33.340752 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:23:33.340758 | orchestrator |
2025-09-23 19:23:33.340765 | orchestrator | TASK [horizon : include_tasks] *************************************************
2025-09-23 19:23:33.340772 | orchestrator | Tuesday 23 September 2025 19:23:22 +0000 (0:00:00.612) 0:00:22.581 *****
2025-09-23 19:23:33.340778 | orchestrator | included: /ansible/roles/horizon/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:23:33.340785 | orchestrator |
2025-09-23 19:23:33.340791 | orchestrator | TASK [service-cert-copy : horizon | Copying over extra CA certificates] ********
2025-09-23 19:23:33.340798 | orchestrator | Tuesday 23 September 2025 19:23:23 +0000 (0:00:00.612) 0:00:23.194 *****
2025-09-23 19:23:33.340809 | orchestrator |
changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.340824 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.340844 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.340852 | orchestrator |
2025-09-23 19:23:33.340859 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS certificate] ***
2025-09-23 19:23:33.340865 | orchestrator | Tuesday 23 September 2025 19:23:25 +0000 (0:00:02.115) 0:00:25.309 *****
2025-09-23 19:23:33.340882 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend
acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.340894 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:23:33.340905 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']
2025-09-23 19:23:33 | INFO  | Task b819efdc-ce7d-4a96-ad24-183c8bf2b186 is in state SUCCESS
2025-09-23 19:23:33.340917 | orchestrator | }, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.340924 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:23:33.340935 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.340942 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:23:33.340949 | orchestrator |
2025-09-23 19:23:33.340956 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS key] *****
2025-09-23 19:23:33.340962 | orchestrator |
Tuesday 23 September 2025 19:23:26 +0000 (0:00:00.740) 0:00:26.050 *****
2025-09-23 19:23:33.340975 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.340987 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:23:33.340998 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.341005 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:23:33.341018 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80',
'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.341061 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:23:33.341088 | orchestrator |
2025-09-23 19:23:33.341095 | orchestrator | TASK [horizon : Deploy horizon container] **************************************
2025-09-23 19:23:33.341102 | orchestrator | Tuesday 23 September 2025 19:23:27 +0000 (0:00:00.818) 0:00:26.868 *****
2025-09-23 19:23:33.341114 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.341127 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.341143 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})
2025-09-23 19:23:33.341155 | orchestrator |
2025-09-23 19:23:33.341162 | orchestrator | TASK [horizon : include_tasks] *************************************************
2025-09-23 19:23:33.341168 | orchestrator | Tuesday 23 September 2025
19:23:29 +0000 (0:00:02.018) 0:00:28.886 ***** 2025-09-23 19:23:33.341175 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:33.341181 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:33.341188 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:33.341194 | orchestrator | 2025-09-23 19:23:33.341201 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2025-09-23 19:23:33.341208 | orchestrator | Tuesday 23 September 2025 19:23:29 +0000 (0:00:00.307) 0:00:29.194 ***** 2025-09-23 19:23:33.341214 | orchestrator | included: /ansible/roles/horizon/tasks/bootstrap.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:23:33.341221 | orchestrator | 2025-09-23 19:23:33.341231 | orchestrator | TASK [horizon : Creating Horizon database] ************************************* 2025-09-23 19:23:33.341238 | orchestrator | Tuesday 23 September 2025 19:23:29 +0000 (0:00:00.519) 0:00:29.713 ***** 2025-09-23 19:23:33.341245 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-23 19:23:33.341251 | orchestrator | 2025-09-23 19:23:33.341258 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:23:33.341265 | orchestrator | testbed-node-0 : ok=33  changed=7  unreachable=0 failed=1  skipped=25  rescued=0 ignored=0 2025-09-23 19:23:33.341271 | orchestrator | testbed-node-1 : ok=33  changed=7  unreachable=0 failed=0 skipped=15  rescued=0 ignored=0 2025-09-23 19:23:33.341278 | orchestrator | testbed-node-2 : ok=33  changed=7  unreachable=0 failed=0 skipped=15  rescued=0 ignored=0 2025-09-23 19:23:33.341285 | orchestrator | 2025-09-23 19:23:33.341291 | orchestrator | 2025-09-23 19:23:33.341298 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:23:33.341304 | orchestrator | Tuesday 23 September 2025 19:23:30 +0000 (0:00:00.804) 0:00:30.518 ***** 2025-09-23 19:23:33.341311 | orchestrator | =============================================================================== 2025-09-23 19:23:33.341317 | orchestrator | horizon : Copying over kolla-settings.py -------------------------------- 2.89s 2025-09-23 19:23:33.341324 | orchestrator | horizon : Copying over horizon.conf ------------------------------------- 2.25s 2025-09-23 19:23:33.341330 | orchestrator | service-cert-copy : horizon | Copying over extra CA certificates -------- 2.12s 2025-09-23 19:23:33.341337 | orchestrator | horizon : Deploy horizon container -------------------------------------- 2.02s 2025-09-23 19:23:33.341343 | orchestrator | horizon : Copying over custom-settings.py ------------------------------- 1.96s 2025-09-23 19:23:33.341350 | orchestrator | horizon : Copying over config.json files for services ------------------- 1.62s 2025-09-23 19:23:33.341360 | orchestrator | horizon : Ensuring config directories exist ----------------------------- 1.37s 2025-09-23 19:23:33.341367 | 
orchestrator | service-cert-copy : horizon | Copying over backend internal TLS key ----- 0.82s 2025-09-23 19:23:33.341373 | orchestrator | horizon : Creating Horizon database ------------------------------------- 0.80s 2025-09-23 19:23:33.341380 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.74s 2025-09-23 19:23:33.341386 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS certificate --- 0.74s 2025-09-23 19:23:33.341393 | orchestrator | horizon : Update policy file name --------------------------------------- 0.67s 2025-09-23 19:23:33.341399 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.61s 2025-09-23 19:23:33.341406 | orchestrator | horizon : Update policy file name --------------------------------------- 0.55s 2025-09-23 19:23:33.341412 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.54s 2025-09-23 19:23:33.341423 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.52s 2025-09-23 19:23:33.341430 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.52s 2025-09-23 19:23:33.341436 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.50s 2025-09-23 19:23:33.341443 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.49s 2025-09-23 19:23:33.341449 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.48s 2025-09-23 19:23:33.341456 | orchestrator | 2025-09-23 19:23:33 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED 2025-09-23 19:23:33.341463 | orchestrator | 2025-09-23 19:23:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:23:36.383253 | orchestrator | 2025-09-23 19:23:36 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:23:36.383734 
| orchestrator | 2025-09-23 19:23:36 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED 2025-09-23 19:23:36.384137 | orchestrator | 2025-09-23 19:23:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:23:39.428554 | orchestrator | 2025-09-23 19:23:39 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:23:39.430712 | orchestrator | 2025-09-23 19:23:39 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED 2025-09-23 19:23:39.430897 | orchestrator | 2025-09-23 19:23:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:23:42.471930 | orchestrator | 2025-09-23 19:23:42 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:23:42.474488 | orchestrator | 2025-09-23 19:23:42 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED 2025-09-23 19:23:42.474547 | orchestrator | 2025-09-23 19:23:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:23:45.515637 | orchestrator | 2025-09-23 19:23:45 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:23:45.516867 | orchestrator | 2025-09-23 19:23:45 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state STARTED 2025-09-23 19:23:45.516900 | orchestrator | 2025-09-23 19:23:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:23:48.555576 | orchestrator | 2025-09-23 19:23:48 | INFO  | Task fd0e8f02-e5bd-4b50-993f-e858937980c3 is in state STARTED 2025-09-23 19:23:48.558452 | orchestrator | 2025-09-23 19:23:48 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:23:48.559813 | orchestrator | 2025-09-23 19:23:48 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:23:48.560584 | orchestrator | 2025-09-23 19:23:48 | INFO  | Task 80f57dbf-3f4e-45c5-af79-c36f52c33c11 is in state STARTED 2025-09-23 19:23:48.561366 | orchestrator | 2025-09-23 19:23:48 | INFO  | Task 
0b889cc3-6297-49cc-a378-c0d599c52bbc is in state STARTED 2025-09-23 19:23:48.563540 | orchestrator | 2025-09-23 19:23:48.563575 | orchestrator | 2025-09-23 19:23:48 | INFO  | Task 052471ae-b976-4e87-b719-7cf866ad3295 is in state SUCCESS 2025-09-23 19:23:48.565905 | orchestrator | 2025-09-23 19:23:48.565966 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:23:48.565987 | orchestrator | 2025-09-23 19:23:48.566005 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:23:48.566124 | orchestrator | Tuesday 23 September 2025 19:23:00 +0000 (0:00:00.366) 0:00:00.366 ***** 2025-09-23 19:23:48.566149 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:48.566168 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:48.566187 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:48.566239 | orchestrator | 2025-09-23 19:23:48.566257 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:23:48.566304 | orchestrator | Tuesday 23 September 2025 19:23:00 +0000 (0:00:00.411) 0:00:00.778 ***** 2025-09-23 19:23:48.566323 | orchestrator | ok: [testbed-node-0] => (item=enable_keystone_True) 2025-09-23 19:23:48.566341 | orchestrator | ok: [testbed-node-1] => (item=enable_keystone_True) 2025-09-23 19:23:48.566359 | orchestrator | ok: [testbed-node-2] => (item=enable_keystone_True) 2025-09-23 19:23:48.566377 | orchestrator | 2025-09-23 19:23:48.566407 | orchestrator | PLAY [Apply role keystone] ***************************************************** 2025-09-23 19:23:48.566427 | orchestrator | 2025-09-23 19:23:48.566447 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2025-09-23 19:23:48.566467 | orchestrator | Tuesday 23 September 2025 19:23:01 +0000 (0:00:00.420) 0:00:01.199 ***** 2025-09-23 19:23:48.566487 | orchestrator | included: 
/ansible/roles/keystone/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:23:48.566508 | orchestrator | 2025-09-23 19:23:48.566526 | orchestrator | TASK [keystone : Ensuring config directories exist] **************************** 2025-09-23 19:23:48.566545 | orchestrator | Tuesday 23 September 2025 19:23:01 +0000 (0:00:00.556) 0:00:01.755 ***** 2025-09-23 19:23:48.566573 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.566601 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.566643 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.566689 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.566715 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.566736 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.566760 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.566784 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.566805 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.566836 | orchestrator | 2025-09-23 19:23:48.566857 | orchestrator | TASK [keystone : Check if policies shall be overwritten] *********************** 2025-09-23 19:23:48.566877 | orchestrator | Tuesday 23 September 2025 19:23:03 +0000 (0:00:01.950) 0:00:03.705 ***** 2025-09-23 19:23:48.566911 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=/opt/configuration/environments/kolla/files/overlays/keystone/policy.yaml) 2025-09-23 19:23:48.566932 | orchestrator | 2025-09-23 19:23:48.566953 | orchestrator | TASK [keystone : Set keystone policy 
file] ************************************* 2025-09-23 19:23:48.566974 | orchestrator | Tuesday 23 September 2025 19:23:04 +0000 (0:00:00.872) 0:00:04.578 ***** 2025-09-23 19:23:48.566995 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:48.567015 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:48.567037 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:48.567086 | orchestrator | 2025-09-23 19:23:48.567105 | orchestrator | TASK [keystone : Check if Keystone domain-specific config is supplied] ********* 2025-09-23 19:23:48.567123 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.503) 0:00:05.082 ***** 2025-09-23 19:23:48.567139 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-23 19:23:48.567159 | orchestrator | 2025-09-23 19:23:48.567176 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2025-09-23 19:23:48.567194 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.669) 0:00:05.751 ***** 2025-09-23 19:23:48.567221 | orchestrator | included: /ansible/roles/keystone/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:23:48.567242 | orchestrator | 2025-09-23 19:23:48.567260 | orchestrator | TASK [service-cert-copy : keystone | Copying over extra CA certificates] ******* 2025-09-23 19:23:48.567279 | orchestrator | Tuesday 23 September 2025 19:23:06 +0000 (0:00:00.520) 0:00:06.272 ***** 2025-09-23 19:23:48.567298 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.567320 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.567341 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.567394 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.567420 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.567440 | orchestrator | changed: [testbed-node-2] => (item={'key': 
'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.567459 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.567477 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.567497 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 
'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.567529 | orchestrator | 2025-09-23 19:23:48.567549 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS certificate] *** 2025-09-23 19:23:48.567568 | orchestrator | Tuesday 23 September 2025 19:23:09 +0000 (0:00:03.103) 0:00:09.375 ***** 2025-09-23 19:23:48.567599 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:23:48.567627 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 
'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.567647 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:23:48.567666 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:48.567685 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 
'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:23:48.567718 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.567746 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:23:48.567765 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:48.567791 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 
'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:23:48.567810 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.567829 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:23:48.567848 | 
orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:48.567866 | orchestrator | 2025-09-23 19:23:48.567884 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS key] **** 2025-09-23 19:23:48.567914 | orchestrator | Tuesday 23 September 2025 19:23:10 +0000 (0:00:01.155) 0:00:10.530 ***** 2025-09-23 19:23:48.567933 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:23:48.567962 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 
'timeout': '30'}}})  2025-09-23 19:23:48.567981 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:23:48.568000 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:48.568025 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:23:48.568046 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.568125 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-23 19:23:48.568146 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:23:48.568176 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.568205 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-23 19:23:48.568225 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:48.568245 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:48.568264 | orchestrator | 2025-09-23 19:23:48.568282 | orchestrator | TASK [keystone : Copying over config.json files for services] ****************** 2025-09-23 19:23:48.568301 | orchestrator | Tuesday 23 September 2025 19:23:11 +0000 (0:00:00.862) 0:00:11.392 ***** 2025-09-23 19:23:48.568320 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 
'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.568351 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.568382 | 
orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.568409 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568430 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568449 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568478 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568498 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568517 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568536 | orchestrator | 2025-09-23 19:23:48.568553 | orchestrator | TASK [keystone : Copying over keystone.conf] *********************************** 2025-09-23 19:23:48.568572 | orchestrator | Tuesday 23 September 2025 19:23:14 +0000 (0:00:03.361) 0:00:14.754 ***** 2025-09-23 19:23:48.568610 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance 
roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.568631 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.568660 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.568680 | orchestrator | skipping: [testbed-node-1] 
=> (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.568709 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.568741 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.568762 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568783 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568815 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.568835 | orchestrator | 2025-09-23 19:23:48.568852 | orchestrator | TASK [keystone : Copying keystone-startup script for keystone] ***************** 2025-09-23 19:23:48.568871 | orchestrator | Tuesday 23 September 2025 19:23:21 +0000 (0:00:06.285) 0:00:21.039 ***** 2025-09-23 19:23:48.568890 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:23:48.568909 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:23:48.568928 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:23:48.568948 | orchestrator | 2025-09-23 19:23:48.568966 | orchestrator | TASK [keystone : Create Keystone domain-specific config directory] ************* 2025-09-23 19:23:48.568985 | orchestrator | Tuesday 23 September 2025 19:23:22 +0000 (0:00:01.424) 0:00:22.463 ***** 2025-09-23 19:23:48.569005 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:48.569023 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:48.569043 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:48.569093 | orchestrator | 2025-09-23 19:23:48.569112 | orchestrator | TASK [keystone : Get file list in custom domains folder] *********************** 2025-09-23 19:23:48.569130 | orchestrator | Tuesday 23 September 2025 19:23:23 +0000 (0:00:00.703) 0:00:23.167 ***** 2025-09-23 19:23:48.569149 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:48.569168 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:48.569187 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:48.569205 | orchestrator | 2025-09-23 19:23:48.569224 | orchestrator | TASK [keystone : Copying Keystone Domain specific settings] ******************** 2025-09-23 19:23:48.569242 | orchestrator | Tuesday 23 September 2025 19:23:23 +0000 
(0:00:00.460) 0:00:23.628 ***** 2025-09-23 19:23:48.569261 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:48.569279 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:48.569298 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:48.569315 | orchestrator | 2025-09-23 19:23:48.569335 | orchestrator | TASK [keystone : Copying over existing policy file] **************************** 2025-09-23 19:23:48.569353 | orchestrator | Tuesday 23 September 2025 19:23:24 +0000 (0:00:00.579) 0:00:24.207 ***** 2025-09-23 19:23:48.569395 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.569434 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.569460 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.569485 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.569509 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.569546 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-23 19:23:48.569589 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.569612 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.569635 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.569657 | orchestrator | 2025-09-23 19:23:48.569679 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2025-09-23 19:23:48.569701 | orchestrator | Tuesday 23 September 2025 19:23:26 +0000 (0:00:02.383) 0:00:26.590 ***** 2025-09-23 19:23:48.569724 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:48.569746 | orchestrator | skipping: 
[testbed-node-1] 2025-09-23 19:23:48.569768 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:48.569790 | orchestrator | 2025-09-23 19:23:48.569812 | orchestrator | TASK [keystone : Copying over wsgi-keystone.conf] ****************************** 2025-09-23 19:23:48.569832 | orchestrator | Tuesday 23 September 2025 19:23:27 +0000 (0:00:00.279) 0:00:26.870 ***** 2025-09-23 19:23:48.569855 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2025-09-23 19:23:48.569878 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2025-09-23 19:23:48.569901 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2025-09-23 19:23:48.569923 | orchestrator | 2025-09-23 19:23:48.569944 | orchestrator | TASK [keystone : Checking whether keystone-paste.ini file exists] ************** 2025-09-23 19:23:48.569963 | orchestrator | Tuesday 23 September 2025 19:23:28 +0000 (0:00:01.932) 0:00:28.803 ***** 2025-09-23 19:23:48.569981 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-23 19:23:48.569999 | orchestrator | 2025-09-23 19:23:48.570097 | orchestrator | TASK [keystone : Copying over keystone-paste.ini] ****************************** 2025-09-23 19:23:48.570127 | orchestrator | Tuesday 23 September 2025 19:23:29 +0000 (0:00:00.911) 0:00:29.714 ***** 2025-09-23 19:23:48.570149 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:48.570171 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:48.570193 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:48.570215 | orchestrator | 2025-09-23 19:23:48.570238 | orchestrator | TASK [keystone : Generate the required cron jobs for the node] ***************** 2025-09-23 19:23:48.570260 | orchestrator | Tuesday 23 September 2025 19:23:30 +0000 (0:00:00.875) 0:00:30.589 ***** 2025-09-23 19:23:48.570299 | orchestrator | ok: [testbed-node-0 -> 
localhost] 2025-09-23 19:23:48.570320 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-09-23 19:23:48.570340 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-09-23 19:23:48.570357 | orchestrator | 2025-09-23 19:23:48.570376 | orchestrator | TASK [keystone : Set fact with the generated cron jobs for building the crontab later] *** 2025-09-23 19:23:48.570395 | orchestrator | Tuesday 23 September 2025 19:23:31 +0000 (0:00:01.098) 0:00:31.688 ***** 2025-09-23 19:23:48.570428 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:23:48.570447 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:23:48.570465 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:23:48.570483 | orchestrator | 2025-09-23 19:23:48.570500 | orchestrator | TASK [keystone : Copying files for keystone-fernet] **************************** 2025-09-23 19:23:48.570519 | orchestrator | Tuesday 23 September 2025 19:23:32 +0000 (0:00:00.309) 0:00:31.997 ***** 2025-09-23 19:23:48.570539 | orchestrator | changed: [testbed-node-0] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2025-09-23 19:23:48.570558 | orchestrator | changed: [testbed-node-1] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2025-09-23 19:23:48.570578 | orchestrator | changed: [testbed-node-2] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2025-09-23 19:23:48.570597 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2025-09-23 19:23:48.570617 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2025-09-23 19:23:48.570648 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2025-09-23 19:23:48.570668 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2025-09-23 19:23:48.570687 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 
'fernet-node-sync.sh'}) 2025-09-23 19:23:48.570707 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2025-09-23 19:23:48.570727 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2025-09-23 19:23:48.570747 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2025-09-23 19:23:48.570766 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2025-09-23 19:23:48.570785 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2025-09-23 19:23:48.570803 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2025-09-23 19:23:48.570821 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2025-09-23 19:23:48.570840 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2025-09-23 19:23:48.570859 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2025-09-23 19:23:48.570878 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2025-09-23 19:23:48.570896 | orchestrator | changed: [testbed-node-2] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2025-09-23 19:23:48.570915 | orchestrator | changed: [testbed-node-0] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2025-09-23 19:23:48.570935 | orchestrator | changed: [testbed-node-1] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2025-09-23 19:23:48.570953 | orchestrator | 2025-09-23 19:23:48.570973 | orchestrator | TASK [keystone : Copying files for keystone-ssh] ******************************* 2025-09-23 19:23:48.570992 | orchestrator | Tuesday 23 September 2025 19:23:40 
+0000 (0:00:08.642) 0:00:40.639 ***** 2025-09-23 19:23:48.571012 | orchestrator | changed: [testbed-node-0] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2025-09-23 19:23:48.571031 | orchestrator | changed: [testbed-node-1] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2025-09-23 19:23:48.571132 | orchestrator | changed: [testbed-node-2] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2025-09-23 19:23:48.571157 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2025-09-23 19:23:48.571174 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2025-09-23 19:23:48.571193 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2025-09-23 19:23:48.571211 | orchestrator | 2025-09-23 19:23:48.571230 | orchestrator | TASK [keystone : Check keystone containers] ************************************ 2025-09-23 19:23:48.571249 | orchestrator | Tuesday 23 September 2025 19:23:43 +0000 (0:00:02.934) 0:00:43.573 ***** 2025-09-23 19:23:48.571285 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.571317 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.571339 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 
'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-23 19:23:48.571360 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.571392 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.571412 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-23 19:23:48.571436 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.571459 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.571477 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-23 19:23:48.571495 | orchestrator | 2025-09-23 19:23:48.571510 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2025-09-23 19:23:48.571526 | orchestrator | Tuesday 23 September 2025 19:23:46 +0000 (0:00:02.309) 0:00:45.883 ***** 2025-09-23 19:23:48.571542 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:23:48.571558 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:23:48.571584 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:23:48.571600 | orchestrator | 2025-09-23 19:23:48.571616 | orchestrator | TASK [keystone : Creating keystone database] *********************************** 2025-09-23 19:23:48.571632 | orchestrator | Tuesday 23 September 2025 19:23:46 +0000 (0:00:00.284) 0:00:46.168 ***** 2025-09-23 19:23:48.571647 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-23 19:23:48.571662 | orchestrator | 2025-09-23 19:23:48.571678 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:23:48.571696 | orchestrator | testbed-node-0 : ok=20  changed=10  unreachable=0 failed=1  skipped=8  rescued=0 ignored=0 2025-09-23 19:23:48.571714 | orchestrator | testbed-node-1 : ok=17  changed=10  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2025-09-23 19:23:48.571731 | orchestrator | testbed-node-2 : ok=17  changed=10  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2025-09-23 19:23:48.571746 | orchestrator | 2025-09-23 19:23:48.571763 | orchestrator | 2025-09-23 19:23:48.571779 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:23:48.571796 | orchestrator | Tuesday 23 September 2025 19:23:47 +0000 (0:00:00.753) 0:00:46.922 ***** 2025-09-23 19:23:48.571814 | orchestrator | =============================================================================== 2025-09-23 19:23:48.571836 | orchestrator | keystone : Copying files for keystone-fernet ---------------------------- 8.64s 2025-09-23 19:23:48.571856 | orchestrator | keystone : Copying over keystone.conf ----------------------------------- 6.29s 2025-09-23 19:23:48.571876 | orchestrator | keystone : Copying over config.json files for services ------------------ 3.36s 2025-09-23 19:23:48.571895 | orchestrator | service-cert-copy : keystone | Copying over extra CA certificates ------- 3.10s 2025-09-23 19:23:48.571915 | orchestrator | keystone : Copying files for keystone-ssh ------------------------------- 2.93s 2025-09-23 19:23:48.571934 | orchestrator | keystone : Copying over existing policy file ---------------------------- 2.38s 2025-09-23 19:23:48.571954 | orchestrator | keystone : Check keystone containers ------------------------------------ 2.31s 2025-09-23 19:23:48.571974 | 
orchestrator | keystone : Ensuring config directories exist ---------------------------- 1.95s 2025-09-23 19:23:48.571993 | orchestrator | keystone : Copying over wsgi-keystone.conf ------------------------------ 1.93s 2025-09-23 19:23:48.572013 | orchestrator | keystone : Copying keystone-startup script for keystone ----------------- 1.42s 2025-09-23 19:23:48.572033 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS certificate --- 1.16s 2025-09-23 19:23:48.572131 | orchestrator | keystone : Generate the required cron jobs for the node ----------------- 1.10s 2025-09-23 19:23:48.572172 | orchestrator | keystone : Checking whether keystone-paste.ini file exists -------------- 0.91s 2025-09-23 19:23:48.572192 | orchestrator | keystone : Copying over keystone-paste.ini ------------------------------ 0.88s 2025-09-23 19:23:48.572211 | orchestrator | keystone : Check if policies shall be overwritten ----------------------- 0.87s 2025-09-23 19:23:48.572230 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS key ---- 0.86s 2025-09-23 19:23:48.572251 | orchestrator | keystone : Creating keystone database ----------------------------------- 0.75s 2025-09-23 19:23:48.572270 | orchestrator | keystone : Create Keystone domain-specific config directory ------------- 0.70s 2025-09-23 19:23:48.572289 | orchestrator | keystone : Check if Keystone domain-specific config is supplied --------- 0.67s 2025-09-23 19:23:48.572309 | orchestrator | keystone : Copying Keystone Domain specific settings -------------------- 0.58s 2025-09-23 19:23:48.572325 | orchestrator | 2025-09-23 19:23:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:23:51.593384 | orchestrator | 2025-09-23 19:23:51 | INFO  | Task fd0e8f02-e5bd-4b50-993f-e858937980c3 is in state STARTED 2025-09-23 19:23:51.593720 | orchestrator | 2025-09-23 19:23:51 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 
19:23:51.594413 | orchestrator | 2025-09-23 19:23:51 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:23:51.595106 | orchestrator | 2025-09-23 19:23:51 | INFO  | Task 80f57dbf-3f4e-45c5-af79-c36f52c33c11 is in state STARTED 2025-09-23 19:23:51.595818 | orchestrator | 2025-09-23 19:23:51 | INFO  | Task 0b889cc3-6297-49cc-a378-c0d599c52bbc is in state STARTED 2025-09-23 19:23:51.598170 | orchestrator | 2025-09-23 19:23:51 | INFO  | Wait 1 second(s) until the next check
[the same five tasks are polled every ~3 seconds from 19:23:54 through 19:24:43, all remaining in state STARTED]
2025-09-23 19:24:46.505299 | orchestrator | 2025-09-23 19:24:46 | INFO  | Task fd0e8f02-e5bd-4b50-993f-e858937980c3 is in state SUCCESS 2025-09-23 19:24:46.505785 | orchestrator | 2025-09-23 19:24:46 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:24:46.506802 | orchestrator | 2025-09-23 19:24:46 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:24:46.507791 | orchestrator | 2025-09-23 19:24:46 | INFO  | Task 80f57dbf-3f4e-45c5-af79-c36f52c33c11 is in state STARTED 2025-09-23 19:24:46.508468 | orchestrator | 2025-09-23 19:24:46 | INFO  | Task 0b889cc3-6297-49cc-a378-c0d599c52bbc is in state SUCCESS 2025-09-23 19:24:46.508761 | orchestrator | 2025-09-23 19:24:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:24:49.557628 | orchestrator | 2025-09-23 19:24:49 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:24:49.559717 | orchestrator | 2025-09-23 19:24:49 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:24:49.561915 | orchestrator | 2025-09-23 19:24:49 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:24:49.563188 | orchestrator | 2025-09-23 19:24:49 | INFO  | Task 80f57dbf-3f4e-45c5-af79-c36f52c33c11 is in state STARTED 2025-09-23 19:24:49.564871 | orchestrator | 2025-09-23 19:24:49 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:24:49.564956 | orchestrator | 2025-09-23 19:24:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:24:52.608462 | orchestrator | 2025-09-23 19:24:52 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:24:52.609777 | orchestrator | 2025-09-23 19:24:52 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:24:52.611512 | orchestrator | 2025-09-23 19:24:52 | INFO  | Task
c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED 2025-09-23 19:24:52.614159 | orchestrator | 2025-09-23 19:24:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:24:52.616556 | orchestrator | 2025-09-23 19:24:52 | INFO  | Task 80f57dbf-3f4e-45c5-af79-c36f52c33c11 is in state SUCCESS 2025-09-23 19:24:52.617073 | orchestrator | 2025-09-23 19:24:52.617102 | orchestrator | 2025-09-23 19:24:52.617115 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:24:52.617127 | orchestrator | 2025-09-23 19:24:52.617138 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:24:52.617151 | orchestrator | Tuesday 23 September 2025 19:23:51 +0000 (0:00:00.194) 0:00:00.194 ***** 2025-09-23 19:24:52.617163 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:24:52.617175 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:24:52.617187 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:24:52.617198 | orchestrator | 2025-09-23 19:24:52.617211 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:24:52.617223 | orchestrator | Tuesday 23 September 2025 19:23:51 +0000 (0:00:00.265) 0:00:00.459 ***** 2025-09-23 19:24:52.617235 | orchestrator | ok: [testbed-node-0] => (item=enable_designate_True) 2025-09-23 19:24:52.617247 | orchestrator | ok: [testbed-node-1] => (item=enable_designate_True) 2025-09-23 19:24:52.617259 | orchestrator | ok: [testbed-node-2] => (item=enable_designate_True) 2025-09-23 19:24:52.617271 | orchestrator | 2025-09-23 19:24:52.617282 | orchestrator | PLAY [Apply role designate] **************************************************** 2025-09-23 19:24:52.617294 | orchestrator | 2025-09-23 19:24:52.617306 | orchestrator | TASK [designate : include_tasks] *********************************************** 2025-09-23 19:24:52.617318 | orchestrator | Tuesday 23 
September 2025 19:23:52 +0000 (0:00:00.344) 0:00:00.803 ***** 2025-09-23 19:24:52.617330 | orchestrator | included: /ansible/roles/designate/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:24:52.617342 | orchestrator | 2025-09-23 19:24:52.617353 | orchestrator | TASK [service-ks-register : designate | Creating services] ********************* 2025-09-23 19:24:52.617365 | orchestrator | Tuesday 23 September 2025 19:23:52 +0000 (0:00:00.448) 0:00:01.251 ***** 2025-09-23 19:24:52.617377 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (5 retries left). 2025-09-23 19:24:52.617389 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (4 retries left). 2025-09-23 19:24:52.617400 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (3 retries left). 2025-09-23 19:24:52.617412 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (2 retries left). 2025-09-23 19:24:52.617424 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (1 retries left). 
2025-09-23 19:24:52.617473 | orchestrator | failed: [testbed-node-0] (item=designate (dns)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Designate DNS Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9001"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9001"}], "name": "designate", "type": "dns"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:24:52.617489 | orchestrator |
2025-09-23 19:24:52.617500 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:24:52.617511 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:24:52.617524 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:24:52.617536 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:24:52.617548 | orchestrator |
2025-09-23 19:24:52.617560 | orchestrator |
2025-09-23 19:24:52.617572 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:24:52.617583 | orchestrator | Tuesday 23 September 2025 19:24:46 +0000 (0:00:53.629) 0:00:54.880 *****
2025-09-23 19:24:52.617595 | orchestrator | ===============================================================================
2025-09-23 19:24:52.617607 | orchestrator | service-ks-register : designate | Creating services -------------------- 53.63s
2025-09-23 19:24:52.617619 | orchestrator | designate : include_tasks ----------------------------------------------- 0.45s
2025-09-23 19:24:52.617630 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.34s
2025-09-23 19:24:52.617642 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.27s
2025-09-23 19:24:52.617654 | orchestrator |
2025-09-23 19:24:52.617666 | orchestrator |
2025-09-23 19:24:52.617679 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:24:52.617692 | orchestrator |
2025-09-23 19:24:52.617705 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:24:52.617718 | orchestrator | Tuesday 23 September 2025 19:23:51 +0000 (0:00:00.249) 0:00:00.249 *****
2025-09-23 19:24:52.617732 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:24:52.617745 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:24:52.617758 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:24:52.617771 | orchestrator |
2025-09-23 19:24:52.617784 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:24:52.617798 | orchestrator | Tuesday 23 September 2025 19:23:51 +0000 (0:00:00.345) 0:00:00.594 *****
2025-09-23 19:24:52.617811 | orchestrator | ok: [testbed-node-0] => (item=enable_barbican_True)
2025-09-23 19:24:52.617825 | orchestrator | ok: [testbed-node-1] => (item=enable_barbican_True)
2025-09-23 19:24:52.617839 | orchestrator | ok: [testbed-node-2] => (item=enable_barbican_True)
2025-09-23 19:24:52.617852 | orchestrator |
2025-09-23 19:24:52.617866 | orchestrator | PLAY [Apply role barbican] *****************************************************
2025-09-23 19:24:52.617879 | orchestrator |
2025-09-23 19:24:52.617893 | orchestrator | TASK [barbican : include_tasks] ************************************************
2025-09-23 19:24:52.617917 | orchestrator | Tuesday 23 September 2025 19:23:52 +0000 (0:00:00.331) 0:00:00.926 *****
2025-09-23 19:24:52.617931 | orchestrator | included: /ansible/roles/barbican/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:24:52.617942 | orchestrator |
2025-09-23 19:24:52.617955 | orchestrator | TASK [service-ks-register : barbican | Creating services] **********************
2025-09-23 19:24:52.617968 | orchestrator | Tuesday 23 September 2025 19:23:52 +0000 (0:00:00.419) 0:00:01.345 *****
2025-09-23 19:24:52.617981 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (5 retries left).
2025-09-23 19:24:52.617994 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (4 retries left).
2025-09-23 19:24:52.618071 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (3 retries left).
2025-09-23 19:24:52.618087 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (2 retries left).
2025-09-23 19:24:52.618098 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (1 retries left).
2025-09-23 19:24:52.618111 | orchestrator | failed: [testbed-node-0] (item=barbican (key-manager)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Barbican Key Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9311"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9311"}], "name": "barbican", "type": "key-manager"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:24:52.618124 | orchestrator |
2025-09-23 19:24:52.618136 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:24:52.618147 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:24:52.618159 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:24:52.618170 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:24:52.618182 | orchestrator |
2025-09-23 19:24:52.618193 | orchestrator |
2025-09-23 19:24:52.618205 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:24:52.618221 | orchestrator | Tuesday 23 September 2025 19:24:45 +0000 (0:00:53.291) 0:00:54.637 *****
2025-09-23 19:24:52.618233 | orchestrator | ===============================================================================
2025-09-23 19:24:52.618245 | orchestrator | service-ks-register : barbican | Creating services --------------------- 53.29s
2025-09-23 19:24:52.618256 | orchestrator | barbican : include_tasks ------------------------------------------------ 0.42s
2025-09-23 19:24:52.618267 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.35s
2025-09-23 19:24:52.618278 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.33s
2025-09-23 19:24:52.618289 | orchestrator |
2025-09-23 19:24:52.618300 | orchestrator |
2025-09-23 19:24:52.618311 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:24:52.618322 | orchestrator |
2025-09-23 19:24:52.618333 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:24:52.618344 | orchestrator | Tuesday 23 September 2025 19:23:51 +0000 (0:00:00.238) 0:00:00.238 *****
2025-09-23 19:24:52.618355 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:24:52.618366 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:24:52.618377 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:24:52.618389 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:24:52.618400 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:24:52.618411 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:24:52.618422 | orchestrator |
2025-09-23 19:24:52.618433 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:24:52.618444 | orchestrator | Tuesday 23 September 2025 19:23:52 +0000 (0:00:00.612) 0:00:00.851 *****
2025-09-23 19:24:52.618456 | orchestrator | ok: [testbed-node-0] => (item=enable_neutron_True)
2025-09-23 19:24:52.618467 | orchestrator | ok: [testbed-node-1] => (item=enable_neutron_True)
2025-09-23 19:24:52.618478 | orchestrator | ok: [testbed-node-2] => (item=enable_neutron_True)
2025-09-23 19:24:52.618582 | orchestrator | ok: [testbed-node-3] => (item=enable_neutron_True)
2025-09-23 19:24:52.618598 | orchestrator | ok: [testbed-node-4] => (item=enable_neutron_True)
2025-09-23 19:24:52.618609 | orchestrator | ok: [testbed-node-5] => (item=enable_neutron_True)
2025-09-23 19:24:52.618620 | orchestrator |
2025-09-23 19:24:52.618631 | orchestrator | PLAY [Apply role neutron] ******************************************************
2025-09-23 19:24:52.618641 | orchestrator |
2025-09-23 19:24:52.618652 | orchestrator | TASK [neutron : include_tasks] *************************************************
2025-09-23 19:24:52.618674 | orchestrator | Tuesday 23 September 2025 19:23:52 +0000 (0:00:00.623) 0:00:01.475 *****
2025-09-23 19:24:52.618685 | orchestrator | included: /ansible/roles/neutron/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:24:52.618696 | orchestrator |
2025-09-23 19:24:52.618706 | orchestrator | TASK [neutron : Get container facts] *******************************************
2025-09-23 19:24:52.618717 | orchestrator | Tuesday 23 September 2025 19:23:53 +0000 (0:00:00.995) 0:00:02.470 *****
2025-09-23 19:24:52.618728 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:24:52.618739 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:24:52.618749 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:24:52.618760 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:24:52.618771 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:24:52.618782 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:24:52.618792 | orchestrator |
2025-09-23 19:24:52.618810 | orchestrator | TASK [neutron : Get container volume facts] ************************************
2025-09-23 19:24:52.618821 | orchestrator | Tuesday 23 September 2025 19:23:54 +0000 (0:00:01.091) 0:00:03.562 *****
2025-09-23 19:24:52.618832 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:24:52.618916 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:24:52.618929 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:24:52.618940 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:24:52.618950 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:24:52.618961 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:24:52.618972 | orchestrator |
2025-09-23 19:24:52.618983 | orchestrator | TASK [neutron : Check for ML2/OVN presence] ************************************
2025-09-23 19:24:52.618994 | orchestrator | Tuesday 23 September 2025 19:23:55 +0000 (0:00:01.027) 0:00:04.589 *****
2025-09-23 19:24:52.619005 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:24:52.619016 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:24:52.619054 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:24:52.619065 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:24:52.619076 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:24:52.619086 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:24:52.619097 | orchestrator |
2025-09-23 19:24:52.619108 | orchestrator | TASK [neutron : Check for ML2/OVS presence] ************************************
2025-09-23 19:24:52.619118 | orchestrator | Tuesday 23 September 2025 19:23:56 +0000 (0:00:00.606) 0:00:05.196 *****
2025-09-23 19:24:52.619129 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:24:52.619140 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:24:52.619151 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:24:52.619161 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:24:52.619172 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:24:52.619182 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:24:52.619193 | orchestrator |
2025-09-23 19:24:52.619204 | orchestrator | TASK [service-ks-register : neutron | Creating services] ***********************
2025-09-23 19:24:52.619215 | orchestrator | Tuesday 23 September 2025 19:23:56 +0000 (0:00:00.518) 0:00:05.714 *****
2025-09-23 19:24:52.619226 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (5 retries left).
2025-09-23 19:24:52.619237 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (4 retries left).
2025-09-23 19:24:52.619247 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (3 retries left).
2025-09-23 19:24:52.619258 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (2 retries left).
2025-09-23 19:24:52.619269 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (1 retries left).
2025-09-23 19:24:52.619286 | orchestrator | failed: [testbed-node-0] (item=neutron (network)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Openstack Networking", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9696"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9696"}], "name": "neutron", "type": "network"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:24:52.619306 | orchestrator |
2025-09-23 19:24:52.619317 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:24:52.619328 | orchestrator | testbed-node-0 : ok=5  changed=0 unreachable=0 failed=1  skipped=2  rescued=0 ignored=0
2025-09-23 19:24:52.619339 | orchestrator | testbed-node-1 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:24:52.619350 | orchestrator | testbed-node-2 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:24:52.619361 | orchestrator | testbed-node-3 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:24:52.619372 | orchestrator | testbed-node-4 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:24:52.619382 | orchestrator | testbed-node-5 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:24:52.619393 | orchestrator |
2025-09-23 19:24:52.619404 | orchestrator |
2025-09-23 19:24:52.619415 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:24:52.619426 | orchestrator | Tuesday 23 September 2025 19:24:50 +0000 (0:00:53.203) 0:00:58.917 *****
2025-09-23 19:24:52.619437 | orchestrator | ===============================================================================
2025-09-23 19:24:52.619448 | orchestrator | service-ks-register : neutron | Creating services ---------------------- 53.20s
2025-09-23 19:24:52.619459 | orchestrator | neutron : Get container facts ------------------------------------------- 1.09s
2025-09-23 19:24:52.619469 | orchestrator | neutron : Get container volume facts ------------------------------------ 1.03s
2025-09-23 19:24:52.619480 | orchestrator | neutron : include_tasks ------------------------------------------------- 1.00s
2025-09-23 19:24:52.619491 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.62s
2025-09-23 19:24:52.619501 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.61s
2025-09-23 19:24:52.619512 | orchestrator | neutron : Check for ML2/OVN presence ------------------------------------ 0.61s
2025-09-23 19:24:52.619523 | orchestrator | neutron : Check for ML2/OVS presence ------------------------------------ 0.52s
2025-09-23 19:24:52.619541 | orchestrator | 2025-09-23 19:24:52 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED
2025-09-23 19:24:52.619555 | orchestrator | 2025-09-23 19:24:52 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:24:55.651433 | orchestrator | 2025-09-23 19:24:55 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED
2025-09-23 19:24:55.653559 | orchestrator | 2025-09-23 19:24:55 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED
2025-09-23 19:24:55.655616 | orchestrator | 2025-09-23 19:24:55 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:24:55.658545 | orchestrator | 2025-09-23 19:24:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:24:55.664518 | orchestrator | 2025-09-23 19:24:55 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED
2025-09-23 19:24:55.664541 | orchestrator | 2025-09-23 19:24:55 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:24:58.702837 | orchestrator | 2025-09-23 19:24:58 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED
2025-09-23 19:24:58.703178 | orchestrator | 2025-09-23 19:24:58 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED
2025-09-23 19:24:58.704321 | orchestrator | 2025-09-23 19:24:58 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:24:58.705071 | orchestrator | 2025-09-23 19:24:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:24:58.705832 | orchestrator | 2025-09-23 19:24:58 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED
2025-09-23 19:24:58.705853 | orchestrator | 2025-09-23 19:24:58 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:25:01.745328 | orchestrator | 2025-09-23 19:25:01 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED
2025-09-23 19:25:01.746734 | orchestrator | 2025-09-23 19:25:01 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED
2025-09-23 19:25:01.748345 | orchestrator | 2025-09-23 19:25:01 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:25:01.749474 | orchestrator | 2025-09-23 19:25:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:25:01.750851 | orchestrator | 2025-09-23 19:25:01 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED
2025-09-23 19:25:01.750882 | orchestrator | 2025-09-23 19:25:01 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:25:04.794283 | orchestrator | 2025-09-23 19:25:04 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED
2025-09-23 19:25:04.797691 | orchestrator | 2025-09-23 19:25:04 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED
2025-09-23 19:25:04.799500 | orchestrator | 2025-09-23 19:25:04 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:25:04.800534 | orchestrator | 2025-09-23 19:25:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:25:04.801869 | orchestrator | 2025-09-23 19:25:04 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED
2025-09-23 19:25:04.801905 | orchestrator | 2025-09-23 19:25:04 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:25:07.842278 | orchestrator | 2025-09-23 19:25:07 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED
2025-09-23 19:25:07.843572 | orchestrator | 2025-09-23 19:25:07 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED
2025-09-23 19:25:07.844905 | orchestrator | 2025-09-23 19:25:07 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:25:07.846430 | orchestrator | 2025-09-23 19:25:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:25:07.847685 | orchestrator | 2025-09-23 19:25:07 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED
2025-09-23 19:25:07.848038 | orchestrator | 2025-09-23 19:25:07 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:25:10.885459 | orchestrator | 2025-09-23 19:25:10 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED
2025-09-23 19:25:10.887251 | orchestrator | 2025-09-23 19:25:10 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED
2025-09-23 19:25:10.889347 | orchestrator | 2025-09-23 19:25:10 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state STARTED
2025-09-23 19:25:10.891538 | orchestrator | 2025-09-23 19:25:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:25:10.892449 | orchestrator | 2025-09-23 19:25:10 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED
2025-09-23 19:25:10.892550 | orchestrator | 2025-09-23 19:25:10 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:25:13.939945 | orchestrator | 2025-09-23 19:25:13 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED
2025-09-23 19:25:13.942364 | orchestrator | 2025-09-23 19:25:13 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED
2025-09-23 19:25:13.947985 | orchestrator | 2025-09-23 19:25:13 | INFO  | Task c3a04eb4-a25a-4626-8997-2e5a6a3cf357 is in state SUCCESS
2025-09-23 19:25:13.949994 | orchestrator |
2025-09-23 19:25:13.950113 | orchestrator |
2025-09-23 19:25:13.950128 | orchestrator | PLAY [Create ceph pools] *******************************************************
2025-09-23 19:25:13.950140 | orchestrator |
2025-09-23 19:25:13.950151 | orchestrator | TASK [ceph-facts : Include facts.yml] ******************************************
2025-09-23 19:25:13.950162 | orchestrator | Tuesday 23 September 2025 19:23:02 +0000 (0:00:00.590) 0:00:00.590 *****
2025-09-23 19:25:13.950249 | orchestrator | included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-23 19:25:13.950263 | orchestrator |
2025-09-23 19:25:13.950274 | orchestrator | TASK [ceph-facts : Check if it is atomic host] *********************************
2025-09-23 19:25:13.950285 | orchestrator | Tuesday 23 September 2025 19:23:03 +0000 (0:00:00.622) 0:00:01.213 *****
2025-09-23 19:25:13.950296 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.950307 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.950318 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.950329 | orchestrator |
2025-09-23 19:25:13.950339 | orchestrator | TASK [ceph-facts : Set_fact is_atomic] *****************************************
2025-09-23 19:25:13.950350 | orchestrator | Tuesday 23 September 2025 19:23:03 +0000 (0:00:00.684) 0:00:01.898 *****
2025-09-23 19:25:13.950361 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.950371 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.950382 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.950393 | orchestrator |
2025-09-23 19:25:13.950403 | orchestrator | TASK [ceph-facts : Check if podman binary is present] **************************
2025-09-23 19:25:13.950414 | orchestrator | Tuesday 23 September 2025 19:23:04 +0000 (0:00:00.305) 0:00:02.203 *****
2025-09-23 19:25:13.950425 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.950435 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.950446 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.950456 | orchestrator |
2025-09-23 19:25:13.950467 | orchestrator | TASK [ceph-facts : Set_fact container_binary] **********************************
2025-09-23 19:25:13.950492 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.747) 0:00:02.950 *****
2025-09-23 19:25:13.950503 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.950514 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.950525 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.950538 | orchestrator |
2025-09-23 19:25:13.950557 | orchestrator | TASK [ceph-facts : Set_fact ceph_cmd] ******************************************
2025-09-23 19:25:13.950576 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.308) 0:00:03.259 *****
2025-09-23 19:25:13.950595 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.950615 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.950635 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.951409 | orchestrator |
2025-09-23 19:25:13.951441 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python] *********************
2025-09-23 19:25:13.951459 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.311) 0:00:03.570 *****
2025-09-23 19:25:13.951476 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.951487 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.951500 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.951519 | orchestrator |
2025-09-23 19:25:13.951539 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] ***
2025-09-23 19:25:13.951556 | orchestrator | Tuesday 23 September 2025 19:23:05 +0000 (0:00:00.309) 0:00:03.879 *****
2025-09-23 19:25:13.951575 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.951593 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.951611 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.951628 | orchestrator |
2025-09-23 19:25:13.951749 | orchestrator | TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ******************
2025-09-23 19:25:13.951786 | orchestrator | Tuesday 23 September 2025 19:23:06 +0000 (0:00:00.477) 0:00:04.357 *****
2025-09-23 19:25:13.951798 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.951808 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.951819 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.951830 | orchestrator |
2025-09-23 19:25:13.951841 | orchestrator | TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************
2025-09-23 19:25:13.951851 | orchestrator | Tuesday 23 September 2025 19:23:06 +0000 (0:00:00.282) 0:00:04.640 *****
2025-09-23 19:25:13.951862 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2025-09-23 19:25:13.951873 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2025-09-23 19:25:13.951884 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2025-09-23 19:25:13.951894 | orchestrator |
2025-09-23 19:25:13.951905 | orchestrator | TASK [ceph-facts : Set_fact container_exec_cmd] ********************************
2025-09-23 19:25:13.951915 | orchestrator | Tuesday 23 September 2025 19:23:07 +0000 (0:00:00.633) 0:00:05.273 *****
2025-09-23 19:25:13.951926 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.951937 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.951947 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.951958 | orchestrator |
2025-09-23 19:25:13.951968 | orchestrator | TASK [ceph-facts : Find a running mon container] *******************************
2025-09-23 19:25:13.951979 | orchestrator | Tuesday 23 September 2025 19:23:07 +0000 (0:00:00.425) 0:00:05.698 *****
2025-09-23 19:25:13.951990 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0)
2025-09-23 19:25:13.952000 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2025-09-23 19:25:13.952036 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2025-09-23 19:25:13.952051 | orchestrator |
2025-09-23 19:25:13.952062 | orchestrator | TASK [ceph-facts : Check for a ceph mon socket] ********************************
2025-09-23 19:25:13.952072 | orchestrator | Tuesday 23 September 2025 19:23:10 +0000 (0:00:02.254) 0:00:07.952 *****
2025-09-23 19:25:13.952083 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2025-09-23 19:25:13.952094 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2025-09-23 19:25:13.952104 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2025-09-23 19:25:13.952115 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.952125 | orchestrator |
2025-09-23 19:25:13.952136 | orchestrator | TASK [ceph-facts : Check if the ceph mon socket is in-use] *********************
2025-09-23 19:25:13.952194 | orchestrator | Tuesday 23 September 2025 19:23:10 +0000 (0:00:00.462) 0:00:08.415 *****
2025-09-23 19:25:13.952209 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952222 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952233 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952244 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.952255 | orchestrator |
2025-09-23 19:25:13.952266 | orchestrator | TASK [ceph-facts : Set_fact running_mon - non_container] ***********************
2025-09-23 19:25:13.952277 | orchestrator | Tuesday 23 September 2025 19:23:11 +0000 (0:00:00.836) 0:00:09.251 *****
2025-09-23 19:25:13.952299 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952332 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952354 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952374 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.952393 | orchestrator |
2025-09-23 19:25:13.952413 | orchestrator | TASK [ceph-facts : Set_fact running_mon - container] ***************************
2025-09-23 19:25:13.952432 | orchestrator | Tuesday 23 September 2025 19:23:11 +0000 (0:00:00.150) 0:00:09.402 *****
2025-09-23 19:25:13.952454 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '99b0dd2d866d', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2025-09-23 19:23:08.549840', 'end': '2025-09-23 19:23:08.594775', 'delta': '0:00:00.044935', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['99b0dd2d866d'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952479 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': 'a5e7c1593c11', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2025-09-23 19:23:09.332683', 'end': '2025-09-23 19:23:09.375187', 'delta': '0:00:00.042504', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['a5e7c1593c11'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952556 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '3730f0e11c57', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2025-09-23 19:23:09.864185', 'end': '2025-09-23 19:23:09.898993', 'delta': '0:00:00.034808', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['3730f0e11c57'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.952581 | orchestrator |
2025-09-23 19:25:13.952602 | orchestrator | TASK [ceph-facts : Set_fact _container_exec_cmd] *******************************
2025-09-23 19:25:13.952621 | orchestrator | Tuesday 23
September 2025 19:23:11 +0000 (0:00:00.361) 0:00:09.763 *****
2025-09-23 19:25:13.952655 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.952674 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.952693 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.952711 | orchestrator |
2025-09-23 19:25:13.952730 | orchestrator | TASK [ceph-facts : Get current fsid if cluster is already running] *************
2025-09-23 19:25:13.952748 | orchestrator | Tuesday 23 September 2025 19:23:12 +0000 (0:00:00.512) 0:00:10.276 *****
2025-09-23 19:25:13.952768 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)]
2025-09-23 19:25:13.952787 | orchestrator |
2025-09-23 19:25:13.952805 | orchestrator | TASK [ceph-facts : Set_fact current_fsid rc 1] *********************************
2025-09-23 19:25:13.952824 | orchestrator | Tuesday 23 September 2025 19:23:14 +0000 (0:00:01.768) 0:00:12.044 *****
2025-09-23 19:25:13.952842 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.952860 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.952880 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.952899 | orchestrator |
2025-09-23 19:25:13.952918 | orchestrator | TASK [ceph-facts : Get current fsid] *******************************************
2025-09-23 19:25:13.952938 | orchestrator | Tuesday 23 September 2025 19:23:14 +0000 (0:00:00.292) 0:00:12.336 *****
2025-09-23 19:25:13.952956 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.952972 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.952983 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.952993 | orchestrator |
2025-09-23 19:25:13.953004 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2025-09-23 19:25:13.953068 | orchestrator | Tuesday 23 September 2025 19:23:14 +0000 (0:00:00.422) 0:00:12.759 *****
2025-09-23 19:25:13.953183 | orchestrator | skipping: [testbed-node-3]
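As an aside on the tasks above: the "Find a running mon container" loop runs `docker ps -q --filter name=ceph-mon-<hostname>` against each mon host, and the resulting fact is what later tasks use as an exec prefix for running `ceph` commands inside a mon container. The sketch below reconstructs those two command strings from the hostnames seen in this log; the helper names are hypothetical, and the exact exec prefix ceph-ansible composes is an assumption, not taken from this output.

```python
# Illustrative sketch only, not part of the job output above.
# mon_lookup_cmd mirrors the command visible in the log; the
# container_exec_cmd shape is an assumed ceph-ansible-style prefix.

def mon_lookup_cmd(hostname: str) -> str:
    """Command the 'Find a running mon container' loop runs per mon host
    (matches the cmd field recorded in the log)."""
    return f"docker ps -q --filter name=ceph-mon-{hostname}"

def container_exec_cmd(hostname: str) -> str:
    """Assumed exec prefix later tasks would use to run ceph commands
    inside the mon container on the given host."""
    return f"docker exec ceph-mon-{hostname}"

if __name__ == "__main__":
    for host in ("testbed-node-0", "testbed-node-1", "testbed-node-2"):
        print(mon_lookup_cmd(host))
    print(container_exec_cmd("testbed-node-0"))
```

For testbed-node-0 the lookup command reproduces exactly the `cmd` shown in the "Set_fact running_mon - container" items above.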
2025-09-23 19:25:13.953199 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.953210 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.953221 | orchestrator |
2025-09-23 19:25:13.953231 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] ****************************
2025-09-23 19:25:13.953242 | orchestrator | Tuesday 23 September 2025 19:23:15 +0000 (0:00:00.513) 0:00:13.273 *****
2025-09-23 19:25:13.953253 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.953264 | orchestrator |
2025-09-23 19:25:13.953274 | orchestrator | TASK [ceph-facts : Generate cluster fsid] **************************************
2025-09-23 19:25:13.953283 | orchestrator | Tuesday 23 September 2025 19:23:15 +0000 (0:00:00.145) 0:00:13.419 *****
2025-09-23 19:25:13.953292 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.953302 | orchestrator |
2025-09-23 19:25:13.953311 | orchestrator | TASK [ceph-facts : Set_fact fsid] **********************************************
2025-09-23 19:25:13.953321 | orchestrator | Tuesday 23 September 2025 19:23:15 +0000 (0:00:00.211) 0:00:13.630 *****
2025-09-23 19:25:13.953333 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.953350 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.953365 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.953381 | orchestrator |
2025-09-23 19:25:13.953398 | orchestrator | TASK [ceph-facts : Resolve device link(s)] *************************************
2025-09-23 19:25:13.953415 | orchestrator | Tuesday 23 September 2025 19:23:16 +0000 (0:00:00.281) 0:00:13.912 *****
2025-09-23 19:25:13.953432 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.953448 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.953458 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.953467 | orchestrator |
2025-09-23 19:25:13.953477 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved
symlinks] **************
2025-09-23 19:25:13.953486 | orchestrator | Tuesday 23 September 2025 19:23:16 +0000 (0:00:00.334) 0:00:14.246 *****
2025-09-23 19:25:13.953496 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.953505 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.953514 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.953524 | orchestrator |
2025-09-23 19:25:13.953534 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] ***************************
2025-09-23 19:25:13.953551 | orchestrator | Tuesday 23 September 2025 19:23:16 +0000 (0:00:00.336) 0:00:14.722 *****
2025-09-23 19:25:13.953580 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.953598 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.953616 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.953632 | orchestrator |
2025-09-23 19:25:13.953645 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] ****
2025-09-23 19:25:13.953654 | orchestrator | Tuesday 23 September 2025 19:23:17 +0000 (0:00:00.368) 0:00:15.059 *****
2025-09-23 19:25:13.953664 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.953673 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.953682 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.953692 | orchestrator |
2025-09-23 19:25:13.953701 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] ***********************
2025-09-23 19:25:13.953710 | orchestrator | Tuesday 23 September 2025 19:23:17 +0000 (0:00:00.358) 0:00:15.428 *****
2025-09-23 19:25:13.953720 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.953729 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.953739 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.953748 | orchestrator |
2025-09-23 19:25:13.953758 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices
from resolved symlinks] *** 2025-09-23 19:25:13.953807 | orchestrator | Tuesday 23 September 2025 19:23:17 +0000 (0:00:00.358) 0:00:15.786 ***** 2025-09-23 19:25:13.953819 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.953830 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:25:13.953840 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:25:13.953851 | orchestrator | 2025-09-23 19:25:13.953889 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************ 2025-09-23 19:25:13.953900 | orchestrator | Tuesday 23 September 2025 19:23:18 +0000 (0:00:00.503) 0:00:16.290 ***** 2025-09-23 19:25:13.953912 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e-osd--block--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e', 'dm-uuid-LVM-BYh1we6l1Rbny4mpPNGVfmFVqmlrDTdadBL2afMPVC7aYkeSl0VtWfEEDEItKBqD'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.953929 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ad3a695b--9edf--562e--89c9--18fadd13d262-osd--block--ad3a695b--9edf--562e--89c9--18fadd13d262', 'dm-uuid-LVM-NcdhDJBq0TBcw9ePnA00uXvA5tL30WE3Z4S8MCdepjendah0VppDSjGgz9nPXIRI'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.953942 | orchestrator | skipping: 
[testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.953954 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.953966 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.953984 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.953995 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': 
{}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954104 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--1c8984fd--f811--541c--8648--d34ada8a5304-osd--block--1c8984fd--f811--541c--8648--d34ada8a5304', 'dm-uuid-LVM-nw8hRIb2eDpdk169y1rdcFUze1XfuOjJllJ9bGkQ0w0EH5YlPs5Idof0C67ssk46'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954121 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--8028f60e--1a44--5536--9db2--40f94e230aee-osd--block--8028f60e--1a44--5536--9db2--40f94e230aee', 'dm-uuid-LVM-XJfizJhV9UhWBv2FwGTBmsjdeRQx0bAnGRp8GEaT01hL7vlh46uUFrtFT6WiLhoZ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954133 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 
19:25:13.954149 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954161 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954172 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954192 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954204 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 
'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954214 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954253 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954272 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part1', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part14', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part15', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part16', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954291 | 
orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954301 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954311 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e-osd--block--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-qwNbwq-ZWIw-gtu3-bkEl-T6U4-liMO-iGzhMR', 'scsi-0QEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5', 'scsi-SQEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954351 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954363 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--ad3a695b--9edf--562e--89c9--18fadd13d262-osd--block--ad3a695b--9edf--562e--89c9--18fadd13d262'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-2FuINB-zcer-4mIL-BOFU-w1dA-hCsm-AWvBtO', 'scsi-0QEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd', 'scsi-SQEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954383 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part1', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part14', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part15', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part16', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954401 | 
orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--1c8984fd--f811--541c--8648--d34ada8a5304-osd--block--1c8984fd--f811--541c--8648--d34ada8a5304'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-A4Gl3a-uEFo-1YjV-onOt-lDti-Rblb-3dFZee', 'scsi-0QEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85', 'scsi-SQEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954418 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6', 'scsi-SQEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954429 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--8028f60e--1a44--5536--9db2--40f94e230aee-osd--block--8028f60e--1a44--5536--9db2--40f94e230aee'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Ke1zo1-HE7e-DXga-aXdS-u4PO-3JOJ-cGNfpd', 'scsi-0QEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd', 'scsi-SQEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954443 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-43-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954454 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8', 'scsi-SQEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954470 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-40-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954481 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.954491 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:25:13.954501 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5-osd--block--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5', 'dm-uuid-LVM-cTPR2qR6Zc8oAkE17BbZLrodQs1QMSHCIyAIczA6d59xBSXvG9KA9cu5ghiYSaro'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954518 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 
'host': '', 'links': {'ids': ['dm-name-ceph--a2ccb3fa--3e8c--5172--95cb--7cae39233d42-osd--block--a2ccb3fa--3e8c--5172--95cb--7cae39233d42', 'dm-uuid-LVM-QxliPBJmTpLitQexep3vAZAasAjeKSby8Zpqm5RSUCw8quKD9lV8fEk8m3kUJSyu'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954528 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954539 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954553 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954563 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 
'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954579 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954589 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954599 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954608 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 
'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-23 19:25:13.954630 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part1', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part14', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part15', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part16', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 
'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954648 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5-osd--block--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-E2n78L-aS7J-rCCR-K0MN-C7Uz-Tc8Z-2fbQrV', 'scsi-0QEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b', 'scsi-SQEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954659 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--a2ccb3fa--3e8c--5172--95cb--7cae39233d42-osd--block--a2ccb3fa--3e8c--5172--95cb--7cae39233d42'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zZMqce-nVYl-F3hw-V2eM-fbVa-gvW2-fcBvFm', 'scsi-0QEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052', 'scsi-SQEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954669 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e', 'scsi-SQEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954684 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-41-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-23 19:25:13.954694 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:25:13.954704 | orchestrator | 2025-09-23 19:25:13.954713 | orchestrator | TASK [ceph-facts : Set_fact devices 
generate device list when osd_auto_discovery] *** 2025-09-23 19:25:13.954723 | orchestrator | Tuesday 23 September 2025 19:23:19 +0000 (0:00:00.669) 0:00:16.960 ***** 2025-09-23 19:25:13.954733 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e-osd--block--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e', 'dm-uuid-LVM-BYh1we6l1Rbny4mpPNGVfmFVqmlrDTdadBL2afMPVC7aYkeSl0VtWfEEDEItKBqD'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954754 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--1c8984fd--f811--541c--8648--d34ada8a5304-osd--block--1c8984fd--f811--541c--8648--d34ada8a5304', 'dm-uuid-LVM-nw8hRIb2eDpdk169y1rdcFUze1XfuOjJllJ9bGkQ0w0EH5YlPs5Idof0C67ssk46'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954765 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 
'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ad3a695b--9edf--562e--89c9--18fadd13d262-osd--block--ad3a695b--9edf--562e--89c9--18fadd13d262', 'dm-uuid-LVM-NcdhDJBq0TBcw9ePnA00uXvA5tL30WE3Z4S8MCdepjendah0VppDSjGgz9nPXIRI'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954775 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954790 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--8028f60e--1a44--5536--9db2--40f94e230aee-osd--block--8028f60e--1a44--5536--9db2--40f94e230aee', 'dm-uuid-LVM-XJfizJhV9UhWBv2FwGTBmsjdeRQx0bAnGRp8GEaT01hL7vlh46uUFrtFT6WiLhoZ'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 
'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954800 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954810 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954829 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 
'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954840 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954850 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954860 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 
'item'})  2025-09-23 19:25:13.954876 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954887 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954905 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954916 | orchestrator | skipping: 
[testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954926 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954936 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954946 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 
'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954962 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954972 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.954996 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | 
default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part1', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part14', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part15', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part16', 'scsi-SQEMU_QEMU_HARDDISK_0e48d10f-7bad-48f6-8de6-4bf624069e37-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': 
'512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955034 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part1', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part14', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part15', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part16', 'scsi-SQEMU_QEMU_HARDDISK_3daf4c4f-0b25-4d1b-b6fd-5bc1afbef40f-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': 
['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955053 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e-osd--block--ffaf3874--fb75--58cf--9cbc--48a6d8d7ea6e'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-qwNbwq-ZWIw-gtu3-bkEl-T6U4-liMO-iGzhMR', 'scsi-0QEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5', 'scsi-SQEMU_QEMU_HARDDISK_c7f54fe7-669c-4c8c-8645-aaee9eb7e9c5'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955064 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--1c8984fd--f811--541c--8648--d34ada8a5304-osd--block--1c8984fd--f811--541c--8648--d34ada8a5304'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-A4Gl3a-uEFo-1YjV-onOt-lDti-Rblb-3dFZee', 'scsi-0QEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85', 'scsi-SQEMU_QEMU_HARDDISK_ad3d32bb-3e57-4330-95b4-3d115fcffc85'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955080 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--ad3a695b--9edf--562e--89c9--18fadd13d262-osd--block--ad3a695b--9edf--562e--89c9--18fadd13d262'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-2FuINB-zcer-4mIL-BOFU-w1dA-hCsm-AWvBtO', 'scsi-0QEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd', 'scsi-SQEMU_QEMU_HARDDISK_d82469de-3742-489b-9a9c-b38cbdf5a8bd'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955090 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--8028f60e--1a44--5536--9db2--40f94e230aee-osd--block--8028f60e--1a44--5536--9db2--40f94e230aee'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Ke1zo1-HE7e-DXga-aXdS-u4PO-3JOJ-cGNfpd', 'scsi-0QEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd', 'scsi-SQEMU_QEMU_HARDDISK_2f832cfd-0250-47f3-a635-d697408042bd'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955109 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6', 'scsi-SQEMU_QEMU_HARDDISK_8164be3f-bf64-45a9-9145-7091701f0cb6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955120 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8', 'scsi-SQEMU_QEMU_HARDDISK_e110ce94-ffdd-4a74-bff5-0dc6d68dc0c8'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955130 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-43-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955145 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-40-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 
KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955161 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.955171 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:25:13.955181 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5-osd--block--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5', 'dm-uuid-LVM-cTPR2qR6Zc8oAkE17BbZLrodQs1QMSHCIyAIczA6d59xBSXvG9KA9cu5ghiYSaro'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955195 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--a2ccb3fa--3e8c--5172--95cb--7cae39233d42-osd--block--a2ccb3fa--3e8c--5172--95cb--7cae39233d42', 'dm-uuid-LVM-QxliPBJmTpLitQexep3vAZAasAjeKSby8Zpqm5RSUCw8quKD9lV8fEk8m3kUJSyu'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955206 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 
'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955216 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955226 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955240 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 
'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955256 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955270 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955281 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': 
{'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955291 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955308 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part1', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part14', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part15', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part16', 'scsi-SQEMU_QEMU_HARDDISK_3a8bf4eb-6835-436a-8a3d-3e86e0ef5705-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  
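The long runs of "skipping" items above come from a ceph-ansible task that loops over every entry in `ansible_devices` but only acts when `osd_auto_discovery` is enabled. As a rough illustration (not the real role code), the eligibility filter behind auto-discovery can be sketched in Python: skip removable media, disks that already carry partitions, disks already claimed as LVM/device-mapper holders, and virtual loop/dm devices. The device facts below are trimmed from the log output above; `usable_for_osd` is a hypothetical helper name.

```python
# Hedged sketch: which devices would OSD auto-discovery consider usable?
# Mirrors the intent of ceph-ansible's osd_auto_discovery filtering,
# not its actual implementation.

def usable_for_osd(name: str, dev: dict) -> bool:
    if dev.get("removable") == "1":        # e.g. sr0, the config-drive
        return False
    if dev.get("partitions"):              # e.g. sda holding the root fs
        return False
    if dev.get("holders"):                 # e.g. sdb/sdc already in ceph LVs
        return False
    if name.startswith(("loop", "dm-")):   # virtual block devices
        return False
    return True

# Minimal device facts modeled on the ansible_devices output in the log
devices = {
    "sr0": {"removable": "1", "partitions": {}, "holders": []},
    "sda": {"removable": "0", "partitions": {"sda1": {}}, "holders": []},
    "sdb": {"removable": "0", "partitions": {}, "holders": ["ceph-lv"]},
    "sdd": {"removable": "0", "partitions": {}, "holders": []},
}
candidates = [n for n, d in devices.items() if usable_for_osd(n, d)]
print(candidates)  # ['sdd'] -- only the empty, non-removable disk qualifies
```

In this run `osd_auto_discovery` defaults to `False` (`'false_condition': 'osd_auto_discovery | default(False) | bool'`), so every item is skipped regardless of eligibility.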
2025-09-23 19:25:13.955329 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5-osd--block--ecd11808--f35b--5e5a--be1d--5423ee6ce3c5'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-E2n78L-aS7J-rCCR-K0MN-C7Uz-Tc8Z-2fbQrV', 'scsi-0QEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b', 'scsi-SQEMU_QEMU_HARDDISK_d7d70b4c-e10d-4821-8a70-30b75615b27b'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955340 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--a2ccb3fa--3e8c--5172--95cb--7cae39233d42-osd--block--a2ccb3fa--3e8c--5172--95cb--7cae39233d42'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-zZMqce-nVYl-F3hw-V2eM-fbVa-gvW2-fcBvFm', 'scsi-0QEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052', 'scsi-SQEMU_QEMU_HARDDISK_8202d0db-f0b8-43bb-b5ae-a89817ca1052'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955350 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e', 'scsi-SQEMU_QEMU_HARDDISK_fd93bf64-ef7b-4aa6-93bf-ba25b3acca1e'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-23 19:25:13.955366 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-23-18-30-41-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 
'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})
2025-09-23 19:25:13.955382 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.955391 | orchestrator |
2025-09-23 19:25:13.955401 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ******************************
2025-09-23 19:25:13.955411 | orchestrator | Tuesday 23 September 2025 19:23:19 +0000 (0:00:00.879) 0:00:17.839 *****
2025-09-23 19:25:13.955421 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.955430 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.955440 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.955449 | orchestrator |
2025-09-23 19:25:13.955459 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] ***************
2025-09-23 19:25:13.955468 | orchestrator | Tuesday 23 September 2025 19:23:20 +0000 (0:00:00.691) 0:00:18.530 *****
2025-09-23 19:25:13.955477 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.955487 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.955496 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.955505 | orchestrator |
2025-09-23 19:25:13.955515 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2025-09-23 19:25:13.955524 | orchestrator | Tuesday 23 September 2025 19:23:21 +0000 (0:00:00.476) 0:00:19.006 *****
2025-09-23 19:25:13.955534 | orchestrator | ok: [testbed-node-3]
2025-09-23 19:25:13.955543 | orchestrator | ok: [testbed-node-4]
2025-09-23 19:25:13.955552 | orchestrator | ok: [testbed-node-5]
2025-09-23 19:25:13.955562 | orchestrator |
2025-09-23 19:25:13.955571 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2025-09-23 19:25:13.955580 | orchestrator | Tuesday 23 September 2025 19:23:21 +0000 (0:00:00.662) 0:00:19.669 *****
2025-09-23 19:25:13.955594 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.955603 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.955613 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.955622 | orchestrator |
2025-09-23 19:25:13.955632 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2025-09-23 19:25:13.955641 | orchestrator | Tuesday 23 September 2025 19:23:22 +0000 (0:00:00.295) 0:00:19.965 *****
2025-09-23 19:25:13.955651 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.955660 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.955669 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.955678 | orchestrator |
2025-09-23 19:25:13.955688 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2025-09-23 19:25:13.955697 | orchestrator | Tuesday 23 September 2025 19:23:22 +0000 (0:00:00.418) 0:00:20.383 *****
2025-09-23 19:25:13.955707 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.955716 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.955725 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.955735 | orchestrator |
2025-09-23 19:25:13.955744 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] *************************
2025-09-23 19:25:13.955753 | orchestrator | Tuesday 23 September 2025 19:23:22 +0000 (0:00:00.482) 0:00:20.866 *****
2025-09-23 19:25:13.955763 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0)
2025-09-23 19:25:13.955773 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0)
2025-09-23 19:25:13.955782 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0)
2025-09-23 19:25:13.955791 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1)
2025-09-23 19:25:13.955800 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1)
2025-09-23 19:25:13.955810 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1)
2025-09-23 19:25:13.955819 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2)
2025-09-23 19:25:13.955834 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2)
2025-09-23 19:25:13.955843 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-2)
2025-09-23 19:25:13.955853 | orchestrator |
2025-09-23 19:25:13.955862 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] *************************
2025-09-23 19:25:13.955872 | orchestrator | Tuesday 23 September 2025 19:23:23 +0000 (0:00:00.980) 0:00:21.846 *****
2025-09-23 19:25:13.955881 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2025-09-23 19:25:13.955890 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2025-09-23 19:25:13.955900 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2025-09-23 19:25:13.955909 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:25:13.955919 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2025-09-23 19:25:13.955928 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2025-09-23 19:25:13.955937 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2025-09-23 19:25:13.955947 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:25:13.955956 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2025-09-23 19:25:13.955965 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2025-09-23 19:25:13.955975 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2025-09-23 19:25:13.955984 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:25:13.955994 | orchestrator |
2025-09-23 19:25:13.956003 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] ***********************
2025-09-23 19:25:13.956028 | orchestrator | Tuesday 23 September 2025 19:23:24 +0000 (0:00:00.428) 0:00:22.275 ***** 2025-09-23
19:25:13.956038 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:25:13.956048 | orchestrator | 2025-09-23 19:25:13.956058 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2025-09-23 19:25:13.956067 | orchestrator | Tuesday 23 September 2025 19:23:25 +0000 (0:00:00.678) 0:00:22.954 ***** 2025-09-23 19:25:13.956077 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.956086 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:25:13.956096 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:25:13.956105 | orchestrator | 2025-09-23 19:25:13.956119 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2025-09-23 19:25:13.956129 | orchestrator | Tuesday 23 September 2025 19:23:25 +0000 (0:00:00.317) 0:00:23.271 ***** 2025-09-23 19:25:13.956139 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.956148 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:25:13.956157 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:25:13.956166 | orchestrator | 2025-09-23 19:25:13.956176 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2025-09-23 19:25:13.956185 | orchestrator | Tuesday 23 September 2025 19:23:25 +0000 (0:00:00.302) 0:00:23.574 ***** 2025-09-23 19:25:13.956195 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.956204 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:25:13.956213 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:25:13.956223 | orchestrator | 2025-09-23 19:25:13.956232 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2025-09-23 19:25:13.956242 | orchestrator | Tuesday 23 September 2025 19:23:26 +0000 (0:00:00.344) 0:00:23.919 ***** 2025-09-23 
19:25:13.956251 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:25:13.956261 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:25:13.956270 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:25:13.956279 | orchestrator | 2025-09-23 19:25:13.956289 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2025-09-23 19:25:13.956298 | orchestrator | Tuesday 23 September 2025 19:23:26 +0000 (0:00:00.609) 0:00:24.528 ***** 2025-09-23 19:25:13.956308 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:25:13.956323 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:25:13.956332 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:25:13.956341 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.956351 | orchestrator | 2025-09-23 19:25:13.956360 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2025-09-23 19:25:13.956377 | orchestrator | Tuesday 23 September 2025 19:23:26 +0000 (0:00:00.352) 0:00:24.880 ***** 2025-09-23 19:25:13.956386 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:25:13.956396 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:25:13.956405 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:25:13.956415 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.956424 | orchestrator | 2025-09-23 19:25:13.956433 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2025-09-23 19:25:13.956443 | orchestrator | Tuesday 23 September 2025 19:23:27 +0000 (0:00:00.354) 0:00:25.235 ***** 2025-09-23 19:25:13.956452 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-23 19:25:13.956462 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-23 19:25:13.956471 | 
orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-23 19:25:13.956480 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.956490 | orchestrator | 2025-09-23 19:25:13.956499 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2025-09-23 19:25:13.956508 | orchestrator | Tuesday 23 September 2025 19:23:27 +0000 (0:00:00.371) 0:00:25.606 ***** 2025-09-23 19:25:13.956518 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:25:13.956527 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:25:13.956536 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:25:13.956554 | orchestrator | 2025-09-23 19:25:13.956572 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2025-09-23 19:25:13.956588 | orchestrator | Tuesday 23 September 2025 19:23:28 +0000 (0:00:00.372) 0:00:25.978 ***** 2025-09-23 19:25:13.956603 | orchestrator | ok: [testbed-node-3] => (item=0) 2025-09-23 19:25:13.956619 | orchestrator | ok: [testbed-node-4] => (item=0) 2025-09-23 19:25:13.956635 | orchestrator | ok: [testbed-node-5] => (item=0) 2025-09-23 19:25:13.956651 | orchestrator | 2025-09-23 19:25:13.956668 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] ************************************** 2025-09-23 19:25:13.956684 | orchestrator | Tuesday 23 September 2025 19:23:28 +0000 (0:00:00.618) 0:00:26.597 ***** 2025-09-23 19:25:13.956701 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-23 19:25:13.956711 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-23 19:25:13.956720 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-23 19:25:13.956730 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2025-09-23 19:25:13.956739 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => 
(item=testbed-node-4) 2025-09-23 19:25:13.956749 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2025-09-23 19:25:13.956758 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2025-09-23 19:25:13.956767 | orchestrator | 2025-09-23 19:25:13.956777 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ******************************** 2025-09-23 19:25:13.956786 | orchestrator | Tuesday 23 September 2025 19:23:29 +0000 (0:00:00.964) 0:00:27.561 ***** 2025-09-23 19:25:13.956796 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-23 19:25:13.956805 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-23 19:25:13.956814 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-23 19:25:13.956824 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2025-09-23 19:25:13.956840 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2025-09-23 19:25:13.956850 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2025-09-23 19:25:13.956860 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2025-09-23 19:25:13.956869 | orchestrator | 2025-09-23 19:25:13.956884 | orchestrator | TASK [Include tasks from the ceph-osd role] ************************************ 2025-09-23 19:25:13.956894 | orchestrator | Tuesday 23 September 2025 19:23:31 +0000 (0:00:02.034) 0:00:29.596 ***** 2025-09-23 19:25:13.956904 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:25:13.956913 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:25:13.956923 | orchestrator | included: /ansible/tasks/openstack_config.yml for testbed-node-5 2025-09-23 19:25:13.956932 | orchestrator | 2025-09-23 19:25:13.956941 | 
orchestrator | TASK [create openstack pool(s)] ************************************************ 2025-09-23 19:25:13.956951 | orchestrator | Tuesday 23 September 2025 19:23:32 +0000 (0:00:00.373) 0:00:29.970 ***** 2025-09-23 19:25:13.956961 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'backups', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-23 19:25:13.956972 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'volumes', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-23 19:25:13.956988 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'images', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-23 19:25:13.956998 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'metrics', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-23 19:25:13.957008 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'vms', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-23 19:25:13.957064 | orchestrator | 2025-09-23 19:25:13.957075 | orchestrator | TASK [generate keys] 
*********************************************************** 2025-09-23 19:25:13.957084 | orchestrator | Tuesday 23 September 2025 19:24:17 +0000 (0:00:45.314) 0:01:15.285 ***** 2025-09-23 19:25:13.957094 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957103 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957112 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957122 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957131 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957141 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957150 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] }}] 2025-09-23 19:25:13.957159 | orchestrator | 2025-09-23 19:25:13.957169 | orchestrator | TASK [get keys from monitors] ************************************************** 2025-09-23 19:25:13.957178 | orchestrator | Tuesday 23 September 2025 19:24:41 +0000 (0:00:23.945) 0:01:39.231 ***** 2025-09-23 19:25:13.957188 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957204 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957214 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957223 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957232 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957242 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957251 | orchestrator | 
ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-23 19:25:13.957261 | orchestrator | 2025-09-23 19:25:13.957270 | orchestrator | TASK [copy ceph key(s) if needed] ********************************************** 2025-09-23 19:25:13.957279 | orchestrator | Tuesday 23 September 2025 19:24:53 +0000 (0:00:11.943) 0:01:51.174 ***** 2025-09-23 19:25:13.957289 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957298 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-23 19:25:13.957308 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-23 19:25:13.957317 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957326 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-23 19:25:13.957336 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-23 19:25:13.957351 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957361 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-23 19:25:13.957370 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-23 19:25:13.957380 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957389 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-23 19:25:13.957398 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-23 19:25:13.957408 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957417 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 
2025-09-23 19:25:13.957427 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-23 19:25:13.957436 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-23 19:25:13.957445 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-23 19:25:13.957455 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-23 19:25:13.957464 | orchestrator | changed: [testbed-node-5 -> {{ item.1 }}] 2025-09-23 19:25:13.957474 | orchestrator | 2025-09-23 19:25:13.957483 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:25:13.957492 | orchestrator | testbed-node-3 : ok=25  changed=0 unreachable=0 failed=0 skipped=28  rescued=0 ignored=0 2025-09-23 19:25:13.957505 | orchestrator | testbed-node-4 : ok=18  changed=0 unreachable=0 failed=0 skipped=21  rescued=0 ignored=0 2025-09-23 19:25:13.957513 | orchestrator | testbed-node-5 : ok=23  changed=3  unreachable=0 failed=0 skipped=20  rescued=0 ignored=0 2025-09-23 19:25:13.957521 | orchestrator | 2025-09-23 19:25:13.957529 | orchestrator | 2025-09-23 19:25:13.957537 | orchestrator | 2025-09-23 19:25:13.957544 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:25:13.957552 | orchestrator | Tuesday 23 September 2025 19:25:10 +0000 (0:00:17.348) 0:02:08.522 ***** 2025-09-23 19:25:13.957560 | orchestrator | =============================================================================== 2025-09-23 19:25:13.957573 | orchestrator | create openstack pool(s) ----------------------------------------------- 45.31s 2025-09-23 19:25:13.957581 | orchestrator | generate keys ---------------------------------------------------------- 23.95s 2025-09-23 19:25:13.957589 | orchestrator | copy ceph key(s) if needed --------------------------------------------- 17.35s 
2025-09-23 19:25:13.957597 | orchestrator | get keys from monitors ------------------------------------------------- 11.94s 2025-09-23 19:25:13.957604 | orchestrator | ceph-facts : Find a running mon container ------------------------------- 2.25s 2025-09-23 19:25:13.957612 | orchestrator | ceph-facts : Set_fact ceph_admin_command -------------------------------- 2.03s 2025-09-23 19:25:13.957620 | orchestrator | ceph-facts : Get current fsid if cluster is already running ------------- 1.77s 2025-09-23 19:25:13.957628 | orchestrator | ceph-facts : Set_fact _monitor_addresses - ipv4 ------------------------- 0.98s 2025-09-23 19:25:13.957635 | orchestrator | ceph-facts : Set_fact ceph_run_cmd -------------------------------------- 0.96s 2025-09-23 19:25:13.957643 | orchestrator | ceph-facts : Set_fact devices generate device list when osd_auto_discovery --- 0.88s 2025-09-23 19:25:13.957651 | orchestrator | ceph-facts : Check if the ceph mon socket is in-use --------------------- 0.84s 2025-09-23 19:25:13.957659 | orchestrator | ceph-facts : Check if podman binary is present -------------------------- 0.75s 2025-09-23 19:25:13.957667 | orchestrator | ceph-facts : Check if the ceph conf exists ------------------------------ 0.69s 2025-09-23 19:25:13.957674 | orchestrator | ceph-facts : Check if it is atomic host --------------------------------- 0.68s 2025-09-23 19:25:13.957682 | orchestrator | ceph-facts : Import_tasks set_radosgw_address.yml ----------------------- 0.68s 2025-09-23 19:25:13.957690 | orchestrator | ceph-facts : Collect existed devices ------------------------------------ 0.67s 2025-09-23 19:25:13.957698 | orchestrator | ceph-facts : Read osd pool default crush rule --------------------------- 0.66s 2025-09-23 19:25:13.957706 | orchestrator | ceph-facts : Set_fact monitor_name ansible_facts['hostname'] ------------ 0.63s 2025-09-23 19:25:13.957713 | orchestrator | ceph-facts : Include facts.yml ------------------------------------------ 0.62s 2025-09-23 
19:25:13.957721 | orchestrator | ceph-facts : Set_fact rgw_instances ------------------------------------- 0.62s 2025-09-23 19:25:13.957729 | orchestrator | 2025-09-23 19:25:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:13.957737 | orchestrator | 2025-09-23 19:25:13 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state STARTED 2025-09-23 19:25:13.957745 | orchestrator | 2025-09-23 19:25:13 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:13.957752 | orchestrator | 2025-09-23 19:25:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:16.994736 | orchestrator | 2025-09-23 19:25:16 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:25:16.997074 | orchestrator | 2025-09-23 19:25:16 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:16.998802 | orchestrator | 2025-09-23 19:25:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:17.000850 | orchestrator | 2025-09-23 19:25:17 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state STARTED 2025-09-23 19:25:17.002342 | orchestrator | 2025-09-23 19:25:17 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:17.002625 | orchestrator | 2025-09-23 19:25:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:20.047757 | orchestrator | 2025-09-23 19:25:20 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:25:20.049258 | orchestrator | 2025-09-23 19:25:20 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:20.051000 | orchestrator | 2025-09-23 19:25:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:20.052211 | orchestrator | 2025-09-23 19:25:20 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state STARTED 2025-09-23 19:25:20.053704 | 
orchestrator | 2025-09-23 19:25:20 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:20.054105 | orchestrator | 2025-09-23 19:25:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:23.102644 | orchestrator | 2025-09-23 19:25:23 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:25:23.103978 | orchestrator | 2025-09-23 19:25:23 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:23.105939 | orchestrator | 2025-09-23 19:25:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:23.107761 | orchestrator | 2025-09-23 19:25:23 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state STARTED 2025-09-23 19:25:23.109347 | orchestrator | 2025-09-23 19:25:23 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:23.109629 | orchestrator | 2025-09-23 19:25:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:26.159918 | orchestrator | 2025-09-23 19:25:26 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:25:26.162397 | orchestrator | 2025-09-23 19:25:26 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:26.165103 | orchestrator | 2025-09-23 19:25:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:26.168003 | orchestrator | 2025-09-23 19:25:26 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state STARTED 2025-09-23 19:25:26.170240 | orchestrator | 2025-09-23 19:25:26 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:26.170317 | orchestrator | 2025-09-23 19:25:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:29.220106 | orchestrator | 2025-09-23 19:25:29 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:25:29.220698 | orchestrator | 2025-09-23 
19:25:29 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:29.222385 | orchestrator | 2025-09-23 19:25:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:29.223711 | orchestrator | 2025-09-23 19:25:29 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state STARTED 2025-09-23 19:25:29.225219 | orchestrator | 2025-09-23 19:25:29 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:29.225255 | orchestrator | 2025-09-23 19:25:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:32.259399 | orchestrator | 2025-09-23 19:25:32 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:25:32.260881 | orchestrator | 2025-09-23 19:25:32 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:32.262247 | orchestrator | 2025-09-23 19:25:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:32.263764 | orchestrator | 2025-09-23 19:25:32 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state STARTED 2025-09-23 19:25:32.265565 | orchestrator | 2025-09-23 19:25:32 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:32.265591 | orchestrator | 2025-09-23 19:25:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:35.305256 | orchestrator | 2025-09-23 19:25:35 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state STARTED 2025-09-23 19:25:35.305580 | orchestrator | 2025-09-23 19:25:35 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:35.307593 | orchestrator | 2025-09-23 19:25:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:35.308813 | orchestrator | 2025-09-23 19:25:35 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state STARTED 2025-09-23 19:25:35.309531 | orchestrator | 2025-09-23 
19:25:35 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:35.309561 | orchestrator | 2025-09-23 19:25:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:38.350956 | orchestrator | 2025-09-23 19:25:38 | INFO  | Task fb9dccd4-b8dc-4861-a1aa-43aa4a0bfc62 is in state SUCCESS 2025-09-23 19:25:38.353153 | orchestrator | 2025-09-23 19:25:38 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:38.354669 | orchestrator | 2025-09-23 19:25:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:38.356811 | orchestrator | 2025-09-23 19:25:38 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED 2025-09-23 19:25:38.357718 | orchestrator | 2025-09-23 19:25:38 | INFO  | Task 6043c97e-d7ce-4df9-ac8e-d2f4c907d4cf is in state SUCCESS 2025-09-23 19:25:38.359325 | orchestrator | 2025-09-23 19:25:38 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:38.359350 | orchestrator | 2025-09-23 19:25:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:41.397603 | orchestrator | 2025-09-23 19:25:41 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED 2025-09-23 19:25:41.399132 | orchestrator | 2025-09-23 19:25:41 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:41.401119 | orchestrator | 2025-09-23 19:25:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:41.402651 | orchestrator | 2025-09-23 19:25:41 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED 2025-09-23 19:25:41.404270 | orchestrator | 2025-09-23 19:25:41 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state STARTED 2025-09-23 19:25:41.404443 | orchestrator | 2025-09-23 19:25:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:44.444092 | orchestrator | 2025-09-23 19:25:44 | INFO  | Task 
fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED 2025-09-23 19:25:44.445510 | orchestrator | 2025-09-23 19:25:44 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state STARTED 2025-09-23 19:25:44.446741 | orchestrator | 2025-09-23 19:25:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:44.448247 | orchestrator | 2025-09-23 19:25:44 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED 2025-09-23 19:25:44.449710 | orchestrator | 2025-09-23 19:25:44.449736 | orchestrator | 2025-09-23 19:25:44.449748 | orchestrator | PLAY [Download ironic ipa images] ********************************************** 2025-09-23 19:25:44.449760 | orchestrator | 2025-09-23 19:25:44.449772 | orchestrator | TASK [Ensure the destination directory exists] ********************************* 2025-09-23 19:25:44.449783 | orchestrator | Tuesday 23 September 2025 19:23:51 +0000 (0:00:00.090) 0:00:00.090 ***** 2025-09-23 19:25:44.449794 | orchestrator | changed: [localhost] 2025-09-23 19:25:44.449806 | orchestrator | 2025-09-23 19:25:44.449817 | orchestrator | TASK [Download ironic-agent initramfs] ***************************************** 2025-09-23 19:25:44.449828 | orchestrator | Tuesday 23 September 2025 19:23:52 +0000 (0:00:00.934) 0:00:01.024 ***** 2025-09-23 19:25:44.449839 | orchestrator | FAILED - RETRYING: [localhost]: Download ironic-agent initramfs (3 retries left). 2025-09-23 19:25:44.449850 | orchestrator | changed: [localhost] 2025-09-23 19:25:44.449885 | orchestrator | 2025-09-23 19:25:44.449897 | orchestrator | TASK [Download ironic-agent kernel] ******************************************** 2025-09-23 19:25:44.449908 | orchestrator | Tuesday 23 September 2025 19:24:49 +0000 (0:00:56.652) 0:00:57.677 ***** 2025-09-23 19:25:44.449918 | orchestrator | FAILED - RETRYING: [localhost]: Download ironic-agent kernel (3 retries left). 
2025-09-23 19:25:44.449929 | orchestrator | FAILED - RETRYING: [localhost]: Download ironic-agent kernel (2 retries left). 2025-09-23 19:25:44.449940 | orchestrator | changed: [localhost] 2025-09-23 19:25:44.449951 | orchestrator | 2025-09-23 19:25:44.449962 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:25:44.449973 | orchestrator | 2025-09-23 19:25:44.449984 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:25:44.449995 | orchestrator | Tuesday 23 September 2025 19:25:35 +0000 (0:00:46.103) 0:01:43.780 ***** 2025-09-23 19:25:44.450077 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:25:44.450089 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:25:44.450100 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:25:44.450111 | orchestrator | 2025-09-23 19:25:44.450122 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:25:44.450132 | orchestrator | Tuesday 23 September 2025 19:25:35 +0000 (0:00:00.277) 0:01:44.057 ***** 2025-09-23 19:25:44.450143 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: enable_ironic_True 2025-09-23 19:25:44.450154 | orchestrator | ok: [testbed-node-0] => (item=enable_ironic_False) 2025-09-23 19:25:44.450165 | orchestrator | ok: [testbed-node-1] => (item=enable_ironic_False) 2025-09-23 19:25:44.450176 | orchestrator | ok: [testbed-node-2] => (item=enable_ironic_False) 2025-09-23 19:25:44.450186 | orchestrator | 2025-09-23 19:25:44.450197 | orchestrator | PLAY [Apply role ironic] ******************************************************* 2025-09-23 19:25:44.450208 | orchestrator | skipping: no hosts matched 2025-09-23 19:25:44.450219 | orchestrator | 2025-09-23 19:25:44.450230 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:25:44.450241 | orchestrator | localhost : 
ok=3  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:25:44.450253 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:25:44.450265 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:25:44.450276 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:25:44.450286 | orchestrator | 2025-09-23 19:25:44.450297 | orchestrator | 2025-09-23 19:25:44.450309 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:25:44.450320 | orchestrator | Tuesday 23 September 2025 19:25:35 +0000 (0:00:00.360) 0:01:44.418 ***** 2025-09-23 19:25:44.450333 | orchestrator | =============================================================================== 2025-09-23 19:25:44.450346 | orchestrator | Download ironic-agent initramfs ---------------------------------------- 56.65s 2025-09-23 19:25:44.450370 | orchestrator | Download ironic-agent kernel ------------------------------------------- 46.10s 2025-09-23 19:25:44.450383 | orchestrator | Ensure the destination directory exists --------------------------------- 0.93s 2025-09-23 19:25:44.450395 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.36s 2025-09-23 19:25:44.450407 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.28s 2025-09-23 19:25:44.450419 | orchestrator | 2025-09-23 19:25:44.450432 | orchestrator | 2025-09-23 19:25:44.450444 | orchestrator | PLAY [Copy ceph keys to the configuration repository] ************************** 2025-09-23 19:25:44.450457 | orchestrator | 2025-09-23 19:25:44.450469 | orchestrator | TASK [Fetch all ceph keys] ***************************************************** 2025-09-23 19:25:44.450481 | orchestrator | Tuesday 23 September 2025 19:25:14 
+0000 (0:00:00.159) 0:00:00.159 ***** 2025-09-23 19:25:44.450502 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2025-09-23 19:25:44.450515 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450527 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450540 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2025-09-23 19:25:44.450552 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450565 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2025-09-23 19:25:44.450590 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2025-09-23 19:25:44.450603 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 2025-09-23 19:25:44.450615 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2025-09-23 19:25:44.450626 | orchestrator | 2025-09-23 19:25:44.450638 | orchestrator | TASK [Create share directory] ************************************************** 2025-09-23 19:25:44.450650 | orchestrator | Tuesday 23 September 2025 19:25:18 +0000 (0:00:04.240) 0:00:04.400 ***** 2025-09-23 19:25:44.450663 | orchestrator | changed: [testbed-manager -> localhost] 2025-09-23 19:25:44.450675 | orchestrator | 2025-09-23 19:25:44.450688 | orchestrator | TASK [Write ceph keys to the share directory] ********************************** 2025-09-23 19:25:44.450700 | orchestrator | Tuesday 23 September 2025 19:25:19 +0000 (0:00:00.912) 0:00:05.312 ***** 2025-09-23 19:25:44.450711 | orchestrator | changed: [testbed-manager -> 
localhost] => (item=ceph.client.admin.keyring) 2025-09-23 19:25:44.450721 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450732 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450743 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder-backup.keyring) 2025-09-23 19:25:44.450754 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450764 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.nova.keyring) 2025-09-23 19:25:44.450775 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.glance.keyring) 2025-09-23 19:25:44.450787 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.gnocchi.keyring) 2025-09-23 19:25:44.450797 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.manila.keyring) 2025-09-23 19:25:44.450808 | orchestrator | 2025-09-23 19:25:44.450819 | orchestrator | TASK [Write ceph keys to the configuration directory] ************************** 2025-09-23 19:25:44.450829 | orchestrator | Tuesday 23 September 2025 19:25:31 +0000 (0:00:11.624) 0:00:16.936 ***** 2025-09-23 19:25:44.450840 | orchestrator | changed: [testbed-manager] => (item=ceph.client.admin.keyring) 2025-09-23 19:25:44.450851 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450861 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450872 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder-backup.keyring) 2025-09-23 19:25:44.450883 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2025-09-23 19:25:44.450894 | orchestrator | changed: [testbed-manager] => (item=ceph.client.nova.keyring) 2025-09-23 19:25:44.450905 | orchestrator | 
changed: [testbed-manager] => (item=ceph.client.glance.keyring) 2025-09-23 19:25:44.450915 | orchestrator | changed: [testbed-manager] => (item=ceph.client.gnocchi.keyring) 2025-09-23 19:25:44.450926 | orchestrator | changed: [testbed-manager] => (item=ceph.client.manila.keyring) 2025-09-23 19:25:44.450944 | orchestrator | 2025-09-23 19:25:44.450955 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:25:44.450966 | orchestrator | testbed-manager : ok=4  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:25:44.450977 | orchestrator | 2025-09-23 19:25:44.450988 | orchestrator | 2025-09-23 19:25:44.451013 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:25:44.451024 | orchestrator | Tuesday 23 September 2025 19:25:37 +0000 (0:00:05.956) 0:00:22.892 ***** 2025-09-23 19:25:44.451035 | orchestrator | =============================================================================== 2025-09-23 19:25:44.451046 | orchestrator | Write ceph keys to the share directory --------------------------------- 11.62s 2025-09-23 19:25:44.451061 | orchestrator | Write ceph keys to the configuration directory -------------------------- 5.96s 2025-09-23 19:25:44.451072 | orchestrator | Fetch all ceph keys ----------------------------------------------------- 4.24s 2025-09-23 19:25:44.451083 | orchestrator | Create share directory -------------------------------------------------- 0.91s 2025-09-23 19:25:44.451094 | orchestrator | 2025-09-23 19:25:44.451104 | orchestrator | 2025-09-23 19:25:44.451115 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:25:44.451125 | orchestrator | 2025-09-23 19:25:44.451136 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:25:44.451147 | orchestrator | Tuesday 23 September 2025 19:24:49 +0000 
(0:00:00.198) 0:00:00.198 ***** 2025-09-23 19:25:44.451158 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:25:44.451168 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:25:44.451179 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:25:44.451189 | orchestrator | 2025-09-23 19:25:44.451200 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:25:44.451210 | orchestrator | Tuesday 23 September 2025 19:24:49 +0000 (0:00:00.222) 0:00:00.421 ***** 2025-09-23 19:25:44.451221 | orchestrator | ok: [testbed-node-0] => (item=enable_placement_True) 2025-09-23 19:25:44.451232 | orchestrator | ok: [testbed-node-1] => (item=enable_placement_True) 2025-09-23 19:25:44.451242 | orchestrator | ok: [testbed-node-2] => (item=enable_placement_True) 2025-09-23 19:25:44.451253 | orchestrator | 2025-09-23 19:25:44.451263 | orchestrator | PLAY [Apply role placement] **************************************************** 2025-09-23 19:25:44.451274 | orchestrator | 2025-09-23 19:25:44.451284 | orchestrator | TASK [placement : include_tasks] *********************************************** 2025-09-23 19:25:44.451295 | orchestrator | Tuesday 23 September 2025 19:24:49 +0000 (0:00:00.295) 0:00:00.717 ***** 2025-09-23 19:25:44.451306 | orchestrator | included: /ansible/roles/placement/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:25:44.451316 | orchestrator | 2025-09-23 19:25:44.451333 | orchestrator | TASK [service-ks-register : placement | Creating services] ********************* 2025-09-23 19:25:44.451344 | orchestrator | Tuesday 23 September 2025 19:24:50 +0000 (0:00:00.498) 0:00:01.215 ***** 2025-09-23 19:25:44.451355 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (5 retries left). 2025-09-23 19:25:44.451366 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (4 retries left). 
2025-09-23 19:25:44.451377 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (3 retries left). 2025-09-23 19:25:44.451387 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (2 retries left). 2025-09-23 19:25:44.451398 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (1 retries left). 2025-09-23 19:25:44.451410 | orchestrator | failed: [testbed-node-0] (item=placement (placement)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Placement Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:8780"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:8780"}], "name": "placement", "type": "placement"}, "msg": "kolla_toolbox container is not running."} 2025-09-23 19:25:44.451432 | orchestrator | 2025-09-23 19:25:44.451443 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:25:44.451454 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2025-09-23 19:25:44.451465 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:25:44.451476 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:25:44.451486 | orchestrator | 2025-09-23 19:25:44.451497 | orchestrator | 2025-09-23 19:25:44.451508 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:25:44.451519 | orchestrator | Tuesday 23 September 2025 19:25:43 +0000 (0:00:53.308) 0:00:54.524 ***** 2025-09-23 19:25:44.451529 | orchestrator | =============================================================================== 2025-09-23 19:25:44.451540 | orchestrator | service-ks-register : placement | Creating services -------------------- 53.31s 2025-09-23 
19:25:44.451551 | orchestrator | placement : include_tasks ----------------------------------------------- 0.50s 2025-09-23 19:25:44.451561 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.30s 2025-09-23 19:25:44.451572 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.22s 2025-09-23 19:25:44.451583 | orchestrator | 2025-09-23 19:25:44 | INFO  | Task 1d41da56-e45d-4aee-9382-ab25c82f3001 is in state SUCCESS 2025-09-23 19:25:44.451594 | orchestrator | 2025-09-23 19:25:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:47.488053 | orchestrator | 2025-09-23 19:25:47 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED 2025-09-23 19:25:47.488851 | orchestrator | 2025-09-23 19:25:47 | INFO  | Task e645372f-1622-4193-90aa-4c62b3982481 is in state SUCCESS 2025-09-23 19:25:47.490191 | orchestrator | 2025-09-23 19:25:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:47.490819 | orchestrator | 2025-09-23 19:25:47 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED 2025-09-23 19:25:47.490847 | orchestrator | 2025-09-23 19:25:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:50.525346 | orchestrator | 2025-09-23 19:25:50 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED 2025-09-23 19:25:50.526350 | orchestrator | 2025-09-23 19:25:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:25:50.527942 | orchestrator | 2025-09-23 19:25:50 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED 2025-09-23 19:25:50.528234 | orchestrator | 2025-09-23 19:25:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:25:53.569068 | orchestrator | 2025-09-23 19:25:53 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED 2025-09-23 19:25:53.573690 | orchestrator | 2025-09-23 19:25:53 | INFO  | Task 
afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:25:53.575480 | orchestrator | 2025-09-23 19:25:53 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:25:53.575507 | orchestrator | 2025-09-23 19:25:53 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:25:56.619800 | orchestrator | 2025-09-23 19:25:56 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:25:56.620990 | orchestrator | 2025-09-23 19:25:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:25:56.623728 | orchestrator | 2025-09-23 19:25:56 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:25:56.623787 | orchestrator | 2025-09-23 19:25:56 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:25:59.669974 | orchestrator | 2025-09-23 19:25:59 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:25:59.671857 | orchestrator | 2025-09-23 19:25:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:25:59.672991 | orchestrator | 2025-09-23 19:25:59 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:25:59.673080 | orchestrator | 2025-09-23 19:25:59 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:02.712876 | orchestrator | 2025-09-23 19:26:02 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:02.713290 | orchestrator | 2025-09-23 19:26:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:02.714467 | orchestrator | 2025-09-23 19:26:02 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:02.714497 | orchestrator | 2025-09-23 19:26:02 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:05.764528 | orchestrator | 2025-09-23 19:26:05 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:05.765352 | orchestrator | 2025-09-23 19:26:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:05.768182 | orchestrator | 2025-09-23 19:26:05 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:05.768283 | orchestrator | 2025-09-23 19:26:05 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:08.816697 | orchestrator | 2025-09-23 19:26:08 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:08.819150 | orchestrator | 2025-09-23 19:26:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:08.821220 | orchestrator | 2025-09-23 19:26:08 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:08.821244 | orchestrator | 2025-09-23 19:26:08 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:11.864410 | orchestrator | 2025-09-23 19:26:11 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:11.865915 | orchestrator | 2025-09-23 19:26:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:11.867823 | orchestrator | 2025-09-23 19:26:11 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:11.867896 | orchestrator | 2025-09-23 19:26:11 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:14.919671 | orchestrator | 2025-09-23 19:26:14 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:14.920372 | orchestrator | 2025-09-23 19:26:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:14.922087 | orchestrator | 2025-09-23 19:26:14 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:14.922163 | orchestrator | 2025-09-23 19:26:14 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:17.967064 | orchestrator | 2025-09-23 19:26:17 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:17.967482 | orchestrator | 2025-09-23 19:26:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:17.968242 | orchestrator | 2025-09-23 19:26:17 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:17.968282 | orchestrator | 2025-09-23 19:26:17 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:21.020441 | orchestrator | 2025-09-23 19:26:21 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:21.021449 | orchestrator | 2025-09-23 19:26:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:21.023669 | orchestrator | 2025-09-23 19:26:21 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:21.023694 | orchestrator | 2025-09-23 19:26:21 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:24.070587 | orchestrator | 2025-09-23 19:26:24 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:24.074220 | orchestrator | 2025-09-23 19:26:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:24.077102 | orchestrator | 2025-09-23 19:26:24 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:24.077187 | orchestrator | 2025-09-23 19:26:24 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:27.113824 | orchestrator | 2025-09-23 19:26:27 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:27.115869 | orchestrator | 2025-09-23 19:26:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:27.117214 | orchestrator | 2025-09-23 19:26:27 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:27.117256 | orchestrator | 2025-09-23 19:26:27 | INFO  |
Wait 1 second(s) until the next check
2025-09-23 19:26:30.165577 | orchestrator | 2025-09-23 19:26:30 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:30.168083 | orchestrator | 2025-09-23 19:26:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:30.169463 | orchestrator | 2025-09-23 19:26:30 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:30.169495 | orchestrator | 2025-09-23 19:26:30 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:33.209300 | orchestrator | 2025-09-23 19:26:33 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state STARTED
2025-09-23 19:26:33.211486 | orchestrator | 2025-09-23 19:26:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:33.213254 | orchestrator | 2025-09-23 19:26:33 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state STARTED
2025-09-23 19:26:33.213602 | orchestrator | 2025-09-23 19:26:33 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:36.260561 | orchestrator |
2025-09-23 19:26:36.260647 | orchestrator |
2025-09-23 19:26:36.260662 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-23 19:26:36.260673 | orchestrator |
2025-09-23 19:26:36.260684 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-23 19:26:36.260696 | orchestrator | Tuesday 23 September 2025 19:24:50 +0000 (0:00:00.231) 0:00:00.231 *****
2025-09-23 19:26:36.260707 | orchestrator | ok: [testbed-node-0]
2025-09-23 19:26:36.260718 | orchestrator | ok: [testbed-node-1]
2025-09-23 19:26:36.260729 | orchestrator | ok: [testbed-node-2]
2025-09-23 19:26:36.260739 | orchestrator |
2025-09-23 19:26:36.260750 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-23 19:26:36.260760 | orchestrator | Tuesday 23 September 2025 19:24:50 +0000 (0:00:00.246) 0:00:00.478 *****
2025-09-23 19:26:36.260771 | orchestrator | ok: [testbed-node-0] => (item=enable_magnum_True)
2025-09-23 19:26:36.260782 | orchestrator | ok: [testbed-node-1] => (item=enable_magnum_True)
2025-09-23 19:26:36.260792 | orchestrator | ok: [testbed-node-2] => (item=enable_magnum_True)
2025-09-23 19:26:36.260803 | orchestrator |
2025-09-23 19:26:36.260813 | orchestrator | PLAY [Apply role magnum] *******************************************************
2025-09-23 19:26:36.260848 | orchestrator |
2025-09-23 19:26:36.260860 | orchestrator | TASK [magnum : include_tasks] **************************************************
2025-09-23 19:26:36.260871 | orchestrator | Tuesday 23 September 2025 19:24:50 +0000 (0:00:00.325) 0:00:00.803 *****
2025-09-23 19:26:36.260881 | orchestrator | included: /ansible/roles/magnum/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-23 19:26:36.260892 | orchestrator |
2025-09-23 19:26:36.260903 | orchestrator | TASK [service-ks-register : magnum | Creating services] ************************
2025-09-23 19:26:36.260914 | orchestrator | Tuesday 23 September 2025 19:24:51 +0000 (0:00:00.466) 0:00:01.270 *****
2025-09-23 19:26:36.261085 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (5 retries left).
2025-09-23 19:26:36.261108 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (4 retries left).
2025-09-23 19:26:36.261120 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (3 retries left).
2025-09-23 19:26:36.261131 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (2 retries left).
2025-09-23 19:26:36.261143 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (1 retries left).
2025-09-23 19:26:36.261157 | orchestrator | failed: [testbed-node-0] (item=magnum (container-infra)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Container Infrastructure Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9511/v1"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9511/v1"}], "name": "magnum", "type": "container-infra"}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:26:36.261173 | orchestrator |
2025-09-23 19:26:36.261184 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:26:36.261196 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-23 19:26:36.261209 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:26:36.261221 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:26:36.261233 | orchestrator |
2025-09-23 19:26:36.261245 | orchestrator |
2025-09-23 19:26:36.261256 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:26:36.261268 | orchestrator | Tuesday 23 September 2025 19:25:44 +0000 (0:00:53.244) 0:00:54.515 *****
2025-09-23 19:26:36.261279 | orchestrator | ===============================================================================
2025-09-23 19:26:36.261291 | orchestrator | service-ks-register : magnum | Creating services ----------------------- 53.25s
2025-09-23 19:26:36.261303 | orchestrator | magnum : include_tasks -------------------------------------------------- 0.47s
2025-09-23 19:26:36.261314 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.33s
2025-09-23 19:26:36.261326 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.25s
2025-09-23 19:26:36.261337 | orchestrator |
2025-09-23 19:26:36.261349 | orchestrator |
2025-09-23 19:26:36.261361 | orchestrator | PLAY [Apply role cephclient] ***************************************************
2025-09-23 19:26:36.261373 | orchestrator |
2025-09-23 19:26:36.261384 | orchestrator | TASK [osism.services.cephclient : Include container tasks] *********************
2025-09-23 19:26:36.261396 | orchestrator | Tuesday 23 September 2025 19:25:41 +0000 (0:00:00.217) 0:00:00.217 *****
2025-09-23 19:26:36.261408 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/cephclient/tasks/container.yml for testbed-manager
2025-09-23 19:26:36.261420 | orchestrator |
2025-09-23 19:26:36.261431 | orchestrator | TASK [osism.services.cephclient : Create required directories] *****************
2025-09-23 19:26:36.261443 | orchestrator | Tuesday 23 September 2025 19:25:41 +0000 (0:00:00.226) 0:00:00.444 *****
2025-09-23 19:26:36.261464 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/configuration)
2025-09-23 19:26:36.261476 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/data)
2025-09-23 19:26:36.261488 | orchestrator | ok: [testbed-manager] => (item=/opt/cephclient)
2025-09-23 19:26:36.261500 | orchestrator |
2025-09-23 19:26:36.261511 | orchestrator | TASK [osism.services.cephclient : Copy configuration files] ********************
2025-09-23 19:26:36.261540 | orchestrator | Tuesday 23 September 2025 19:25:42 +0000 (0:00:01.038) 0:00:01.483 *****
2025-09-23 19:26:36.261552 | orchestrator | changed: [testbed-manager] => (item={'src': 'ceph.conf.j2', 'dest': '/opt/cephclient/configuration/ceph.conf'})
2025-09-23 19:26:36.261564 | orchestrator |
2025-09-23 19:26:36.261576 | orchestrator | TASK [osism.services.cephclient : Copy keyring file] ***************************
2025-09-23 19:26:36.261587 | orchestrator | Tuesday 23 September 2025 19:25:43 +0000 (0:00:00.999) 0:00:02.483 *****
2025-09-23 19:26:36.261599 | orchestrator | changed: [testbed-manager]
2025-09-23 19:26:36.261611 | orchestrator |
2025-09-23 19:26:36.261622 | orchestrator | TASK [osism.services.cephclient : Copy docker-compose.yml file] ****************
2025-09-23 19:26:36.261634 | orchestrator | Tuesday 23 September 2025 19:25:44 +0000 (0:00:00.920) 0:00:03.403 *****
2025-09-23 19:26:36.261645 | orchestrator | changed: [testbed-manager]
2025-09-23 19:26:36.261657 | orchestrator |
2025-09-23 19:26:36.261668 | orchestrator | TASK [osism.services.cephclient : Manage cephclient service] *******************
2025-09-23 19:26:36.261680 | orchestrator | Tuesday 23 September 2025 19:25:45 +0000 (0:00:00.793) 0:00:04.196 *****
2025-09-23 19:26:36.261691 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage cephclient service (10 retries left).
2025-09-23 19:26:36.261703 | orchestrator | ok: [testbed-manager]
2025-09-23 19:26:36.261716 | orchestrator |
2025-09-23 19:26:36.261729 | orchestrator | TASK [osism.services.cephclient : Copy wrapper scripts] ************************
2025-09-23 19:26:36.261742 | orchestrator | Tuesday 23 September 2025 19:26:25 +0000 (0:00:40.362) 0:00:44.559 *****
2025-09-23 19:26:36.261754 | orchestrator | changed: [testbed-manager] => (item=ceph)
2025-09-23 19:26:36.261768 | orchestrator | changed: [testbed-manager] => (item=ceph-authtool)
2025-09-23 19:26:36.261781 | orchestrator | changed: [testbed-manager] => (item=rados)
2025-09-23 19:26:36.261799 | orchestrator | changed: [testbed-manager] => (item=radosgw-admin)
2025-09-23 19:26:36.261812 | orchestrator | changed: [testbed-manager] => (item=rbd)
2025-09-23 19:26:36.261825 | orchestrator |
2025-09-23 19:26:36.261838 | orchestrator | TASK [osism.services.cephclient : Remove old wrapper scripts] ******************
2025-09-23 19:26:36.261851 | orchestrator | Tuesday 23 September 2025 19:26:29 +0000 (0:00:03.671) 0:00:48.230 *****
2025-09-23 19:26:36.261864 | orchestrator | ok: [testbed-manager] => (item=crushtool)
2025-09-23 19:26:36.261877 | orchestrator |
2025-09-23 19:26:36.261889 | orchestrator | TASK [osism.services.cephclient : Include package tasks] ***********************
2025-09-23 19:26:36.261902 | orchestrator | Tuesday 23 September 2025 19:26:29 +0000 (0:00:00.402) 0:00:48.632 *****
2025-09-23 19:26:36.261915 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:26:36.261928 | orchestrator |
2025-09-23 19:26:36.261940 | orchestrator | TASK [osism.services.cephclient : Include rook task] ***************************
2025-09-23 19:26:36.261953 | orchestrator | Tuesday 23 September 2025 19:26:29 +0000 (0:00:00.114) 0:00:48.746 *****
2025-09-23 19:26:36.261966 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:26:36.261996 | orchestrator |
2025-09-23 19:26:36.262008 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Restart cephclient service] *******
2025-09-23 19:26:36.262072 | orchestrator | Tuesday 23 September 2025 19:26:30 +0000 (0:00:00.276) 0:00:49.023 *****
2025-09-23 19:26:36.262085 | orchestrator | changed: [testbed-manager]
2025-09-23 19:26:36.262097 | orchestrator |
2025-09-23 19:26:36.262108 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Ensure that all containers are up] ***
2025-09-23 19:26:36.262118 | orchestrator | Tuesday 23 September 2025 19:26:31 +0000 (0:00:01.640) 0:00:50.663 *****
2025-09-23 19:26:36.262129 | orchestrator | changed: [testbed-manager]
2025-09-23 19:26:36.262147 | orchestrator |
2025-09-23 19:26:36.262158 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Wait for an healthy service] ******
2025-09-23 19:26:36.262169 | orchestrator | Tuesday 23 September 2025 19:26:32 +0000 (0:00:00.689) 0:00:51.353 *****
2025-09-23 19:26:36.262179 | orchestrator | changed: [testbed-manager]
2025-09-23 19:26:36.262279 | orchestrator |
2025-09-23 19:26:36.262296 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Copy bash completion scripts] *****
2025-09-23 19:26:36.262307 | orchestrator | Tuesday 23 September 2025 19:26:33 +0000 (0:00:00.572) 0:00:51.925 *****
2025-09-23 19:26:36.262318 | orchestrator | ok: [testbed-manager] => (item=ceph)
2025-09-23 19:26:36.262329 | orchestrator | ok: [testbed-manager] => (item=rados)
2025-09-23 19:26:36.262340 | orchestrator | ok: [testbed-manager] => (item=radosgw-admin)
2025-09-23 19:26:36.262351 | orchestrator | ok: [testbed-manager] => (item=rbd)
2025-09-23 19:26:36.262361 | orchestrator |
2025-09-23 19:26:36.262372 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:26:36.262383 | orchestrator | testbed-manager : ok=12  changed=8  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0
2025-09-23 19:26:36.262394 | orchestrator |
2025-09-23 19:26:36.262404 | orchestrator |
2025-09-23 19:26:36.262415 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:26:36.262426 | orchestrator | Tuesday 23 September 2025 19:26:34 +0000 (0:00:01.338) 0:00:53.263 *****
2025-09-23 19:26:36.262437 | orchestrator | ===============================================================================
2025-09-23 19:26:36.262447 | orchestrator | osism.services.cephclient : Manage cephclient service ------------------ 40.36s
2025-09-23 19:26:36.262458 | orchestrator | osism.services.cephclient : Copy wrapper scripts ------------------------ 3.67s
2025-09-23 19:26:36.262469 | orchestrator | osism.services.cephclient : Restart cephclient service ------------------ 1.64s
2025-09-23 19:26:36.262479 | orchestrator | osism.services.cephclient : Copy bash completion scripts ---------------- 1.34s
2025-09-23 19:26:36.262490 | orchestrator | osism.services.cephclient : Create required directories ----------------- 1.04s
2025-09-23 19:26:36.262501 | orchestrator | osism.services.cephclient : Copy configuration files -------------------- 1.00s
2025-09-23 19:26:36.262511 | orchestrator | osism.services.cephclient : Copy keyring file --------------------------- 0.92s
2025-09-23 19:26:36.262522 | orchestrator | osism.services.cephclient : Copy docker-compose.yml file ---------------- 0.79s
2025-09-23 19:26:36.262541 | orchestrator | osism.services.cephclient : Ensure that all containers are up ----------- 0.69s
2025-09-23 19:26:36.262552 | orchestrator | osism.services.cephclient : Wait for an healthy service ----------------- 0.57s
2025-09-23 19:26:36.262563 | orchestrator | osism.services.cephclient : Remove old wrapper scripts ------------------ 0.40s
2025-09-23 19:26:36.262574 | orchestrator | osism.services.cephclient : Include rook task --------------------------- 0.28s
2025-09-23 19:26:36.262584 | orchestrator | osism.services.cephclient : Include container tasks --------------------- 0.23s
2025-09-23 19:26:36.262595 | orchestrator | osism.services.cephclient : Include package tasks ----------------------- 0.11s
2025-09-23 19:26:36.262606 | orchestrator | 2025-09-23 19:26:36 | INFO  | Task fa9c902c-14da-4e6f-b5be-b0ee3c2c8359 is in state SUCCESS
2025-09-23 19:26:36.262617 | orchestrator | 2025-09-23 19:26:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:36.262628 | orchestrator | 2025-09-23 19:26:36 | INFO  | Task a6d643b4-63df-4729-b733-3b5934236b96 is in state SUCCESS
2025-09-23 19:26:36.263715 | orchestrator | 2025-09-23 19:26:36 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:26:36.264905 | orchestrator | 2025-09-23 19:26:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:26:36.265968 | orchestrator | 2025-09-23 19:26:36 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:26:36.266152 | orchestrator | 2025-09-23 19:26:36 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:39.312604 | orchestrator | 2025-09-23 19:26:39 | INFO  | Task
afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:39.313827 | orchestrator | 2025-09-23 19:26:39 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:26:39.315235 | orchestrator | 2025-09-23 19:26:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:26:39.316935 | orchestrator | 2025-09-23 19:26:39 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:26:39.316969 | orchestrator | 2025-09-23 19:26:39 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:42.361023 | orchestrator | 2025-09-23 19:26:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:42.361378 | orchestrator | 2025-09-23 19:26:42 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:26:42.362218 | orchestrator | 2025-09-23 19:26:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:26:42.363139 | orchestrator | 2025-09-23 19:26:42 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:26:42.363658 | orchestrator | 2025-09-23 19:26:42 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:45.411202 | orchestrator | 2025-09-23 19:26:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:45.411334 | orchestrator | 2025-09-23 19:26:45 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:26:45.412242 | orchestrator | 2025-09-23 19:26:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:26:45.413508 | orchestrator | 2025-09-23 19:26:45 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:26:45.413605 | orchestrator | 2025-09-23 19:26:45 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:48.598736 | orchestrator | 2025-09-23 19:26:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:48.599946 | orchestrator | 2025-09-23 19:26:48 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:26:48.602101 | orchestrator | 2025-09-23 19:26:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:26:48.603894 | orchestrator | 2025-09-23 19:26:48 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:26:48.603938 | orchestrator | 2025-09-23 19:26:48 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:51.645291 | orchestrator | 2025-09-23 19:26:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:51.645840 | orchestrator | 2025-09-23 19:26:51 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:26:51.647952 | orchestrator | 2025-09-23 19:26:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:26:51.648924 | orchestrator | 2025-09-23 19:26:51 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:26:51.648949 | orchestrator | 2025-09-23 19:26:51 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:54.693532 | orchestrator | 2025-09-23 19:26:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:54.694171 | orchestrator | 2025-09-23 19:26:54 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:26:54.695174 | orchestrator | 2025-09-23 19:26:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:26:54.696430 | orchestrator | 2025-09-23 19:26:54 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:26:54.696643 | orchestrator | 2025-09-23 19:26:54 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:26:57.742259 | orchestrator | 2025-09-23 19:26:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:26:57.742555 | orchestrator | 2025-09-23 19:26:57 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:26:57.744164 | orchestrator | 2025-09-23 19:26:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:26:57.745902 | orchestrator | 2025-09-23 19:26:57 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:26:57.745926 | orchestrator | 2025-09-23 19:26:57 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:00.795283 | orchestrator | 2025-09-23 19:27:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:00.798256 | orchestrator | 2025-09-23 19:27:00 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:00.799881 | orchestrator | 2025-09-23 19:27:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:00.801539 | orchestrator | 2025-09-23 19:27:00 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:00.801958 | orchestrator | 2025-09-23 19:27:00 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:03.832204 | orchestrator | 2025-09-23 19:27:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:03.832500 | orchestrator | 2025-09-23 19:27:03 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:03.833276 | orchestrator | 2025-09-23 19:27:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:03.833897 | orchestrator | 2025-09-23 19:27:03 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:03.834305 | orchestrator | 2025-09-23 19:27:03 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:06.865767 | orchestrator | 2025-09-23 19:27:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:06.866275 | orchestrator | 2025-09-23 19:27:06 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:06.866833 | orchestrator | 2025-09-23 19:27:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:06.867280 | orchestrator | 2025-09-23 19:27:06 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:06.867724 | orchestrator | 2025-09-23 19:27:06 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:10.190358 | orchestrator | 2025-09-23 19:27:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:10.190439 | orchestrator | 2025-09-23 19:27:09 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:10.190454 | orchestrator | 2025-09-23 19:27:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:10.190466 | orchestrator | 2025-09-23 19:27:09 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:10.190476 | orchestrator | 2025-09-23 19:27:09 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:12.928826 | orchestrator | 2025-09-23 19:27:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:12.929913 | orchestrator | 2025-09-23 19:27:12 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:12.932209 | orchestrator | 2025-09-23 19:27:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:12.933928 | orchestrator | 2025-09-23 19:27:12 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:12.934112 | orchestrator | 2025-09-23 19:27:12 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:15.983285 | orchestrator | 2025-09-23 19:27:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:15.984174 | orchestrator | 2025-09-23 19:27:15 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:15.986276 | orchestrator | 2025-09-23 19:27:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:15.988680 | orchestrator | 2025-09-23 19:27:15 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:15.988704 | orchestrator | 2025-09-23 19:27:15 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:19.041556 | orchestrator | 2025-09-23 19:27:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:19.044008 | orchestrator | 2025-09-23 19:27:19 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:19.047735 | orchestrator | 2025-09-23 19:27:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:19.049555 | orchestrator | 2025-09-23 19:27:19 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:19.049585 | orchestrator | 2025-09-23 19:27:19 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:22.097608 | orchestrator | 2025-09-23 19:27:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:22.099112 | orchestrator | 2025-09-23 19:27:22 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:22.101439 | orchestrator | 2025-09-23 19:27:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:22.104274 | orchestrator | 2025-09-23 19:27:22 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:22.104459 | orchestrator | 2025-09-23 19:27:22 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:25.150248 | orchestrator | 2025-09-23 19:27:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:25.151282 | orchestrator | 2025-09-23 19:27:25 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:25.152998 | orchestrator | 2025-09-23 19:27:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:25.154507 | orchestrator | 2025-09-23 19:27:25 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:25.154531 | orchestrator | 2025-09-23 19:27:25 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:28.184469 | orchestrator | 2025-09-23 19:27:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:28.186704 | orchestrator | 2025-09-23 19:27:28 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:28.188718 | orchestrator | 2025-09-23 19:27:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:28.190438 | orchestrator | 2025-09-23 19:27:28 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:28.190473 | orchestrator | 2025-09-23 19:27:28 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:31.238249 | orchestrator | 2025-09-23 19:27:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:31.239753 | orchestrator | 2025-09-23 19:27:31 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:31.241206 | orchestrator | 2025-09-23 19:27:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:31.242781 | orchestrator | 2025-09-23 19:27:31 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:31.242800 | orchestrator | 2025-09-23 19:27:31 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:34.287120 | orchestrator | 2025-09-23 19:27:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:34.288701 | orchestrator | 2025-09-23 19:27:34 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:34.290357 | orchestrator | 2025-09-23 19:27:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:34.291640 | orchestrator | 2025-09-23 19:27:34 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:34.292046 | orchestrator | 2025-09-23 19:27:34 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:37.328533 | orchestrator | 2025-09-23 19:27:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:37.329460 | orchestrator | 2025-09-23 19:27:37 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:37.331373 | orchestrator | 2025-09-23 19:27:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:37.332597 | orchestrator | 2025-09-23 19:27:37 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:37.332701 | orchestrator | 2025-09-23 19:27:37 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:40.371795 | orchestrator | 2025-09-23 19:27:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:40.372095 | orchestrator | 2025-09-23 19:27:40 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:40.372859 | orchestrator | 2025-09-23 19:27:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:40.373579 | orchestrator | 2025-09-23 19:27:40 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:40.373587 | orchestrator | 2025-09-23 19:27:40 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:43.409752 | orchestrator | 2025-09-23 19:27:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:43.411204 | orchestrator | 2025-09-23 19:27:43 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:43.413400 | orchestrator | 2025-09-23 19:27:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:43.415179 | orchestrator | 2025-09-23 19:27:43 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:43.415397 | orchestrator | 2025-09-23 19:27:43 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:46.455419 | orchestrator | 2025-09-23 19:27:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:46.455668 | orchestrator | 2025-09-23 19:27:46 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:46.456573 | orchestrator | 2025-09-23 19:27:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:46.457649 | orchestrator | 2025-09-23 19:27:46 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:46.457664 | orchestrator | 2025-09-23 19:27:46 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:49.499526 | orchestrator | 2025-09-23 19:27:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:27:49.501152 | orchestrator | 2025-09-23 19:27:49 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED
2025-09-23 19:27:49.504841 | orchestrator | 2025-09-23 19:27:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:27:49.506668 | orchestrator | 2025-09-23 19:27:49 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state STARTED
2025-09-23 19:27:49.506705 | orchestrator | 2025-09-23 19:27:49 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:27:52.556595 | orchestrator | 2025-09-23 19:27:52 | INFO  | Task
afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:27:52.558196 | orchestrator | 2025-09-23 19:27:52 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED 2025-09-23 19:27:52.560903 | orchestrator | 2025-09-23 19:27:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:27:52.567961 | orchestrator | 2025-09-23 19:27:52 | INFO  | Task 6dd67121-df60-459d-9cc2-01402444399a is in state SUCCESS 2025-09-23 19:27:52.568357 | orchestrator | 2025-09-23 19:27:52.568369 | orchestrator | 2025-09-23 19:27:52.568373 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:27:52.568378 | orchestrator | 2025-09-23 19:27:52.568382 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:27:52.568386 | orchestrator | Tuesday 23 September 2025 19:25:39 +0000 (0:00:00.239) 0:00:00.239 ***** 2025-09-23 19:27:52.568390 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:27:52.568394 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:27:52.568398 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:27:52.568402 | orchestrator | 2025-09-23 19:27:52.568406 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:27:52.568409 | orchestrator | Tuesday 23 September 2025 19:25:39 +0000 (0:00:00.274) 0:00:00.513 ***** 2025-09-23 19:27:52.568413 | orchestrator | ok: [testbed-node-0] => (item=enable_octavia_True) 2025-09-23 19:27:52.568417 | orchestrator | ok: [testbed-node-1] => (item=enable_octavia_True) 2025-09-23 19:27:52.568421 | orchestrator | ok: [testbed-node-2] => (item=enable_octavia_True) 2025-09-23 19:27:52.568425 | orchestrator | 2025-09-23 19:27:52.568428 | orchestrator | PLAY [Apply role octavia] ****************************************************** 2025-09-23 19:27:52.568432 | orchestrator | 2025-09-23 19:27:52.568436 | orchestrator | TASK 
[octavia : include_tasks] ************************************************* 2025-09-23 19:27:52.568439 | orchestrator | Tuesday 23 September 2025 19:25:40 +0000 (0:00:00.364) 0:00:00.877 ***** 2025-09-23 19:27:52.568443 | orchestrator | included: /ansible/roles/octavia/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:27:52.568447 | orchestrator | 2025-09-23 19:27:52.568451 | orchestrator | TASK [service-ks-register : octavia | Creating services] *********************** 2025-09-23 19:27:52.568455 | orchestrator | Tuesday 23 September 2025 19:25:40 +0000 (0:00:00.539) 0:00:01.417 ***** 2025-09-23 19:27:52.568458 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (5 retries left). 2025-09-23 19:27:52.568462 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (4 retries left). 2025-09-23 19:27:52.568466 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (3 retries left). 2025-09-23 19:27:52.568470 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (2 retries left). 2025-09-23 19:27:52.568473 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (1 retries left). 
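The five FAILED - RETRYING attempts above are Ansible's standard `retries`/`delay`/`until` loop around the service-ks-register task. A minimal sketch of that control flow (the `task` callable is a hypothetical stand-in for the real module invocation, not OSISM's actual code):

```python
import time

def run_with_retries(task, retries=5, delay=5):
    """Re-run `task` until it succeeds or the retry budget is spent,
    mirroring the Ansible retries/delay loop seen in the log."""
    last = None
    for remaining in range(retries, 0, -1):
        last = task()
        if not last.get("failed"):
            return last
        # Ansible prints one of these per failed attempt.
        print(f"FAILED - RETRYING ({remaining} retries left).")
        time.sleep(delay)
    last["attempts"] = retries
    return last
```

If the task never succeeds, the final result carries `failed: true` and `attempts: 5`, which is what the `failed: [testbed-node-0]` record below reports.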
2025-09-23 19:27:52.568502 | orchestrator | failed: [testbed-node-0] (item=octavia (load-balancer)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Octavia Load Balancing Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9876"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9876"}], "name": "octavia", "type": "load-balancer"}, "msg": "kolla_toolbox container is not running."} 2025-09-23 19:27:52.568510 | orchestrator | 2025-09-23 19:27:52.568516 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:27:52.568523 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2025-09-23 19:27:52.568530 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:27:52.568536 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-23 19:27:52.568543 | orchestrator | 2025-09-23 19:27:52.568548 | orchestrator | 2025-09-23 19:27:52.568554 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:27:52.568560 | orchestrator | Tuesday 23 September 2025 19:26:33 +0000 (0:00:53.212) 0:00:54.630 ***** 2025-09-23 19:27:52.568566 | orchestrator | =============================================================================== 2025-09-23 19:27:52.568573 | orchestrator | service-ks-register : octavia | Creating services ---------------------- 53.21s 2025-09-23 19:27:52.568578 | orchestrator | octavia : include_tasks ------------------------------------------------- 0.54s 2025-09-23 19:27:52.568584 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.36s 2025-09-23 19:27:52.568590 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.27s 2025-09-23 
19:27:52.568596 | orchestrator | 2025-09-23 19:27:52.570587 | orchestrator | 2025-09-23 19:27:52.570604 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:27:52.570610 | orchestrator | 2025-09-23 19:27:52.570616 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:27:52.570622 | orchestrator | Tuesday 23 September 2025 19:26:38 +0000 (0:00:00.255) 0:00:00.255 ***** 2025-09-23 19:27:52.570629 | orchestrator | ok: [testbed-manager] 2025-09-23 19:27:52.570634 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:27:52.570640 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:27:52.570647 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:27:52.570652 | orchestrator | ok: [testbed-node-3] 2025-09-23 19:27:52.570658 | orchestrator | ok: [testbed-node-4] 2025-09-23 19:27:52.570664 | orchestrator | ok: [testbed-node-5] 2025-09-23 19:27:52.570670 | orchestrator | 2025-09-23 19:27:52.570676 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:27:52.570682 | orchestrator | Tuesday 23 September 2025 19:26:38 +0000 (0:00:00.740) 0:00:00.996 ***** 2025-09-23 19:27:52.570688 | orchestrator | ok: [testbed-manager] => (item=enable_prometheus_True) 2025-09-23 19:27:52.570694 | orchestrator | ok: [testbed-node-0] => (item=enable_prometheus_True) 2025-09-23 19:27:52.570701 | orchestrator | ok: [testbed-node-1] => (item=enable_prometheus_True) 2025-09-23 19:27:52.570707 | orchestrator | ok: [testbed-node-2] => (item=enable_prometheus_True) 2025-09-23 19:27:52.570713 | orchestrator | ok: [testbed-node-3] => (item=enable_prometheus_True) 2025-09-23 19:27:52.570719 | orchestrator | ok: [testbed-node-4] => (item=enable_prometheus_True) 2025-09-23 19:27:52.570725 | orchestrator | ok: [testbed-node-5] => (item=enable_prometheus_True) 2025-09-23 19:27:52.570730 | orchestrator | 2025-09-23 19:27:52.570737 | 
orchestrator | PLAY [Apply role prometheus] *************************************************** 2025-09-23 19:27:52.570742 | orchestrator | 2025-09-23 19:27:52.570748 | orchestrator | TASK [prometheus : include_tasks] ********************************************** 2025-09-23 19:27:52.570754 | orchestrator | Tuesday 23 September 2025 19:26:39 +0000 (0:00:00.663) 0:00:01.660 ***** 2025-09-23 19:27:52.570769 | orchestrator | included: /ansible/roles/prometheus/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:27:52.570776 | orchestrator | 2025-09-23 19:27:52.570782 | orchestrator | TASK [prometheus : Ensuring config directories exist] ************************** 2025-09-23 19:27:52.570788 | orchestrator | Tuesday 23 September 2025 19:26:40 +0000 (0:00:01.297) 0:00:02.957 ***** 2025-09-23 19:27:52.570796 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-23 19:27:52.570804 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 
'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.570816 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.570823 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.570835 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.570842 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': 
{'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.570849 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.570859 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.570865 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 
19:27:52.570872 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.570881 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.570887 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.570897 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': 
['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.570904 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.570914 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-23 19:27:52.570982 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.570990 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.570999 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.571005 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.571016 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.571022 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.571032 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.571039 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.571045 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.571051 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.571060 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 
'dimensions': {}}}) 2025-09-23 19:27:52.571066 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.571075 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.571085 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.571091 | orchestrator | 2025-09-23 19:27:52.571098 | orchestrator | TASK [prometheus : include_tasks] ********************************************** 2025-09-23 19:27:52.571322 | orchestrator | Tuesday 23 September 2025 19:26:44 +0000 (0:00:03.104) 0:00:06.061 ***** 2025-09-23 19:27:52.571329 | orchestrator | included: 
/ansible/roles/prometheus/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-23 19:27:52.571336 | orchestrator | 2025-09-23 19:27:52.571342 | orchestrator | TASK [service-cert-copy : prometheus | Copying over extra CA certificates] ***** 2025-09-23 19:27:52.571348 | orchestrator | Tuesday 23 September 2025 19:26:45 +0000 (0:00:01.211) 0:00:07.272 ***** 2025-09-23 19:27:52.571354 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-23 19:27:52.571361 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.571371 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 
'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.571377 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.571388 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.571399 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 
19:27:52.571405 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.571412 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.571418 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571424 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571433 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571440 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571449 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571463 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571469 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571476 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571482 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571491 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})
2025-09-23 19:27:52.571499 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571513 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571521 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571527 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571534 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571538 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571542 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571548 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571552 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571563 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571567 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571571 | orchestrator |
2025-09-23 19:27:52.571574 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS certificate] ***
2025-09-23 19:27:52.571578 | orchestrator | Tuesday 23 September 2025 19:26:51 +0000 (0:00:05.868) 0:00:13.141 *****
2025-09-23 19:27:52.571582 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})
2025-09-23 19:27:52.571586 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.571590 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571596 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})
2025-09-23 19:27:52.571604 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571635 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.571640 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571644 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571928 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.571951 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571955 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:27:52.571962 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.571975 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.571986 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572393 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572403 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572409 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.572416 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572422 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572431 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572444 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572469 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.572475 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572482 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572488 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:27:52.572494 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:27:52.572500 | orchestrator | skipping: [testbed-node-2]
2025-09-23 19:27:52.572506 | orchestrator | skipping: [testbed-node-3]
2025-09-23 19:27:52.572512 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.572519 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572531 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572540 | orchestrator | skipping: [testbed-node-4]
2025-09-23 19:27:52.572546 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.572552 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572574 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572581 | orchestrator | skipping: [testbed-node-5]
2025-09-23 19:27:52.572587 | orchestrator |
2025-09-23 19:27:52.572593 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS key] ***
2025-09-23 19:27:52.572599 | orchestrator | Tuesday 23 September 2025 19:26:52 +0000 (0:00:01.510) 0:00:14.652 *****
2025-09-23 19:27:52.572605 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})
2025-09-23 19:27:52.572612 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.572618 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572633 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})
2025-09-23 19:27:52.572640 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572661 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.572667 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572674 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572680 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572690 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572696 | orchestrator | skipping: [testbed-manager]
2025-09-23 19:27:52.572705 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.572711 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572717 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572737 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})
2025-09-23 19:27:52.572744 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572750 | orchestrator | skipping: [testbed-node-0]
2025-09-23 19:27:52.572756 | orchestrator | skipping: [testbed-node-1]
2025-09-23 19:27:52.572762 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})
2025-09-23 19:27:52.572768 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572778 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.572786 | orchestrator | skipping: [testbed-node-2]
=> (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-23 19:27:52.572792 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-23 19:27:52.572799 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.572819 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-23 19:27:52.572826 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': 
['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-23 19:27:52.572832 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-23 19:27:52.572838 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.572844 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-23 19:27:52.572854 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', 
'/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-23 19:27:52.572861 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-23 19:27:52.572867 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.572876 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-23 19:27:52.572883 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-23 19:27:52.572919 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 
'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-23 19:27:52.572927 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.572943 | orchestrator | 2025-09-23 19:27:52.572950 | orchestrator | TASK [prometheus : Copying over config.json files] ***************************** 2025-09-23 19:27:52.572956 | orchestrator | Tuesday 23 September 2025 19:26:54 +0000 (0:00:01.903) 0:00:16.556 ***** 2025-09-23 19:27:52.572962 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.572968 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 
'active_passive': True}}}}) 2025-09-23 19:27:52.572979 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.572985 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.572991 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.572997 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': 
['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.573019 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.573040 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.573047 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573058 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 
'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573064 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573071 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573080 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573086 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573107 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573114 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573123 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 
'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573130 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573135 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573142 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573152 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': 
{'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573172 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-23 19:27:52.573179 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 
'dimensions': {}}}) 2025-09-23 19:27:52.573189 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573195 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573201 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.573207 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573216 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573222 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.573228 | orchestrator | 2025-09-23 19:27:52.573234 | orchestrator | TASK [prometheus : Find custom prometheus alert rules files] ******************* 2025-09-23 19:27:52.573240 | orchestrator | Tuesday 23 September 2025 19:27:00 +0000 (0:00:06.301) 0:00:22.857 ***** 2025-09-23 19:27:52.573246 | orchestrator | ok: [testbed-manager -> localhost] 2025-09-23 19:27:52.573252 | orchestrator | 2025-09-23 19:27:52.573257 | orchestrator | TASK [prometheus : Copying over custom prometheus alert rules files] *********** 2025-09-23 19:27:52.573277 | orchestrator | Tuesday 23 September 2025 19:27:01 +0000 
(0:00:00.864) 0:00:23.722 ***** 2025-09-23 19:27:52.573288 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1080384, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5387938, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573296 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1080384, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5387938, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573302 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/prometheus.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12980, 'inode': 1080420, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5453506, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573309 | orchestrator | skipping: [testbed-node-2] => (item={'path': 
(item=/operations/prometheus/fluentd-aggregator.rules)  2025-09-23 19:27:52.573318 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/prometheus.rules)
2025-09-23 19:27:52.573325 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/fluentd-aggregator.rules)
2025-09-23 19:27:52.573346 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/ceph.rules)
2025-09-23 19:27:52.573355 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/fluentd-aggregator.rules)
2025-09-23 19:27:52.573359 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/ceph.rules)
2025-09-23 19:27:52.573364 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/prometheus.rules)
2025-09-23 19:27:52.573368 | orchestrator | changed: [testbed-manager] => (item=/operations/prometheus/fluentd-aggregator.rules)
2025-09-23 19:27:52.573372 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/fluentd-aggregator.rules)
2025-09-23 19:27:52.573379 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/openstack.rules)
2025-09-23 19:27:52.573393 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/ceph.rules)
2025-09-23 19:27:52.573400 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/openstack.rules)
2025-09-23 19:27:52.573404 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/prometheus.rules)
2025-09-23 19:27:52.573408 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/cadvisor.rules)
2025-09-23 19:27:52.573412 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/openstack.rules)
2025-09-23 19:27:52.573416 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/prometheus.rules)
2025-09-23 19:27:52.573428 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/cadvisor.rules)
2025-09-23 19:27:52.573444 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/prometheus.rules)
2025-09-23 19:27:52.573449 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/ceph.rules)
2025-09-23 19:27:52.573453 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/haproxy.rules)
2025-09-23 19:27:52.573457 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/ceph.rules)
2025-09-23 19:27:52.573461 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/haproxy.rules)
2025-09-23 19:27:52.573465 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/ceph.rules)
2025-09-23 19:27:52.573470 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/cadvisor.rules)
2025-09-23 19:27:52.573477 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/openstack.rules)
2025-09-23 19:27:52.573490 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/openstack.rules)
2025-09-23 19:27:52.573495 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/openstack.rules)
2025-09-23 19:27:52.573499 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/node.rules)
2025-09-23 19:27:52.573503 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/cadvisor.rules)
2025-09-23 19:27:52.573507 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/node.rules)
2025-09-23 19:27:52.573512 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/haproxy.rules)
2025-09-23 19:27:52.573519 | orchestrator | changed: [testbed-manager] => (item=/operations/prometheus/prometheus.rules)
2025-09-23 19:27:52.573532 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/hardware.rules)
2025-09-23 19:27:52.573537 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/cadvisor.rules)
2025-09-23 19:27:52.573540 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/cadvisor.rules)
2025-09-23 19:27:52.573544 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/haproxy.rules)
2025-09-23 19:27:52.573548 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/haproxy.rules)
2025-09-23 19:27:52.573554 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/elasticsearch.rules)
2025-09-23 19:27:52.573562 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/haproxy.rules)
2025-09-23 19:27:52.573576 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/node.rules)
2025-09-23 19:27:52.573580 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/node.rules)
2025-09-23 19:27:52.573584 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/node.rules)
2025-09-23 19:27:52.573588 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/hardware.rules)
2025-09-23 19:27:52.573592 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/node.rules)
2025-09-23 19:27:52.573600 | orchestrator | changed: [testbed-manager] => (item=/operations/prometheus/ceph.rules)
2025-09-23 19:27:52.573604 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/prometheus.rec.rules)
2025-09-23 19:27:52.573617 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/hardware.rules)
2025-09-23 19:27:52.573622 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/hardware.rules)
2025-09-23 19:27:52.573626 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/hardware.rules)
2025-09-23 19:27:52.573630 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/elasticsearch.rules)
2025-09-23 19:27:52.573633 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/alertmanager.rec.rules)
2025-09-23 19:27:52.573642 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/hardware.rules)
2025-09-23 19:27:52.573646 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/elasticsearch.rules)
2025-09-23 19:27:52.573660 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/redfish.rules)
2025-09-23 19:27:52.573664 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/elasticsearch.rules)
2025-09-23 19:27:52.573668 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/elasticsearch.rules)
2025-09-23 19:27:52.573672 | orchestrator | skipping: [testbed-node-0] => (item=/operations/prometheus/prometheus.rec.rules)
2025-09-23 19:27:52.573676 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/prometheus.rec.rules)
2025-09-23 19:27:52.573684 | orchestrator | skipping: [testbed-node-5] => (item=/operations/prometheus/elasticsearch.rules)
2025-09-23 19:27:52.573688 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/prometheus.rec.rules)
2025-09-23 19:27:52.573701 | orchestrator | skipping: [testbed-node-1] => (item=/operations/prometheus/prometheus-extra.rules)
2025-09-23 19:27:52.573706 | orchestrator | skipping: [testbed-node-2] => (item=/operations/prometheus/alertmanager.rec.rules)
2025-09-23 19:27:52.573710 | orchestrator | skipping: [testbed-node-4] => (item=/operations/prometheus/prometheus.rec.rules)
2025-09-23 19:27:52.573713 | orchestrator | changed: [testbed-manager] => (item=/operations/prometheus/openstack.rules)
2025-09-23 19:27:52.573719 | orchestrator | skipping: [testbed-node-3] => (item=/operations/prometheus/alertmanager.rec.rules)
2025-09-23 19:27:52.573725 | orchestrator | 
skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080351, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5163403, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573729 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080416, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5445118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573743 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1080439, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5478232, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573747 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 
'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1080439, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5478232, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573751 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080351, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5163403, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573757 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080358, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.517976, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573768 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1080409, 'dev': 109, 'nlink': 1, 'atime': 
1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5439582, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573777 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080351, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5163403, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573785 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1080439, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5478232, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573795 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080358, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.517976, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 
'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573801 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1080409, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5439582, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573808 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1080439, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5478232, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573814 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1080353, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.516637, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573824 | orchestrator | 
skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1080439, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5478232, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573833 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1080353, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.516637, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573840 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1080409, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5439582, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573850 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 
'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1080396, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5421019, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573857 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080358, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.517976, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573863 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1080409, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5439582, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573870 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1080396, 'dev': 109, 'nlink': 1, 'atime': 
1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5421019, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573879 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1080409, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5439582, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573888 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1080392, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5417485, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573894 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1080353, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.516637, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 
'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573904 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/cadvisor.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3900, 'inode': 1080357, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.517111, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.573910 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080358, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.517976, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573917 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1080436, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5474536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573928 | orchestrator | 
skipping: [testbed-node-2] 2025-09-23 19:27:52.573948 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1080392, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5417485, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573954 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080358, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.517976, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573963 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1080353, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.516637, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573969 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': 
'0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1080436, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5474536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573975 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.573984 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1080396, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5421019, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573990 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1080396, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5421019, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.573996 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 
'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080358, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.517976, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574005 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1080353, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.516637, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574011 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1080392, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5417485, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574066 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1080392, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 
1758653087.5417485, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574073 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1080353, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.516637, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574082 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1080396, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5421019, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574088 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1080436, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5474536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': 
False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574094 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.574104 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1080436, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5474536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574110 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.574116 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/haproxy.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7933, 'inode': 1080386, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5397937, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574122 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1080396, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5421019, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 
'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574130 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1080392, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5417485, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574136 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1080392, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5417485, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574145 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1080436, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5474536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574151 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.574157 | orchestrator | 
skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1080436, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5474536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-23 19:27:52.574166 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.574172 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/node.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 13522, 'inode': 1080399, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5428314, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574178 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/hardware.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5593, 'inode': 1080390, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5411196, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574184 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/elasticsearch.rules', 
'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5987, 'inode': 1080365, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5227935, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574192 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080416, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5445118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574199 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080351, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5163403, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574208 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 
334, 'inode': 1080439, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5478232, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574214 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1080409, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5439582, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574225 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1080358, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.517976, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574232 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1080353, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.516637, 'gr_name': 'root', 'pw_name': 
'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574238 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1080396, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5421019, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574247 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1080392, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5417485, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-23 19:27:52.574252 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1080436, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5474536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': 
False}) 2025-09-23 19:27:52.574255 | orchestrator | 2025-09-23 19:27:52.574259 | orchestrator | TASK [prometheus : Find prometheus common config overrides] ******************** 2025-09-23 19:27:52.574263 | orchestrator | Tuesday 23 September 2025 19:27:24 +0000 (0:00:22.868) 0:00:46.591 ***** 2025-09-23 19:27:52.574267 | orchestrator | ok: [testbed-manager -> localhost] 2025-09-23 19:27:52.574271 | orchestrator | 2025-09-23 19:27:52.574277 | orchestrator | TASK [prometheus : Find prometheus host config overrides] ********************** 2025-09-23 19:27:52.574281 | orchestrator | Tuesday 23 September 2025 19:27:25 +0000 (0:00:00.703) 0:00:47.295 ***** 2025-09-23 19:27:52.574287 | orchestrator | [WARNING]: Skipped 2025-09-23 19:27:52.574291 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574295 | orchestrator | manager/prometheus.yml.d' path due to this access issue: 2025-09-23 19:27:52.574299 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574303 | orchestrator | manager/prometheus.yml.d' is not a directory 2025-09-23 19:27:52.574307 | orchestrator | ok: [testbed-manager -> localhost] 2025-09-23 19:27:52.574310 | orchestrator | [WARNING]: Skipped 2025-09-23 19:27:52.574314 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574318 | orchestrator | node-0/prometheus.yml.d' path due to this access issue: 2025-09-23 19:27:52.574322 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574325 | orchestrator | node-0/prometheus.yml.d' is not a directory 2025-09-23 19:27:52.574329 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-23 19:27:52.574333 | orchestrator | [WARNING]: Skipped 2025-09-23 19:27:52.574336 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574340 
| orchestrator | node-1/prometheus.yml.d' path due to this access issue: 2025-09-23 19:27:52.574344 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574347 | orchestrator | node-1/prometheus.yml.d' is not a directory 2025-09-23 19:27:52.574351 | orchestrator | [WARNING]: Skipped 2025-09-23 19:27:52.574355 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574358 | orchestrator | node-2/prometheus.yml.d' path due to this access issue: 2025-09-23 19:27:52.574362 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574366 | orchestrator | node-2/prometheus.yml.d' is not a directory 2025-09-23 19:27:52.574369 | orchestrator | [WARNING]: Skipped 2025-09-23 19:27:52.574373 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574377 | orchestrator | node-3/prometheus.yml.d' path due to this access issue: 2025-09-23 19:27:52.574380 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574384 | orchestrator | node-3/prometheus.yml.d' is not a directory 2025-09-23 19:27:52.574388 | orchestrator | [WARNING]: Skipped 2025-09-23 19:27:52.574392 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574395 | orchestrator | node-4/prometheus.yml.d' path due to this access issue: 2025-09-23 19:27:52.574399 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574403 | orchestrator | node-4/prometheus.yml.d' is not a directory 2025-09-23 19:27:52.574406 | orchestrator | [WARNING]: Skipped 2025-09-23 19:27:52.574410 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574414 | orchestrator | node-5/prometheus.yml.d' path due 
to this access issue: 2025-09-23 19:27:52.574417 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-23 19:27:52.574421 | orchestrator | node-5/prometheus.yml.d' is not a directory 2025-09-23 19:27:52.574425 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-09-23 19:27:52.574428 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-09-23 19:27:52.574432 | orchestrator | ok: [testbed-node-3 -> localhost] 2025-09-23 19:27:52.574436 | orchestrator | ok: [testbed-node-4 -> localhost] 2025-09-23 19:27:52.574439 | orchestrator | ok: [testbed-node-5 -> localhost] 2025-09-23 19:27:52.574443 | orchestrator | 2025-09-23 19:27:52.574447 | orchestrator | TASK [prometheus : Copying over prometheus config file] ************************ 2025-09-23 19:27:52.574450 | orchestrator | Tuesday 23 September 2025 19:27:26 +0000 (0:00:01.630) 0:00:48.926 ***** 2025-09-23 19:27:52.574456 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-23 19:27:52.574462 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.574466 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-23 19:27:52.574470 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.574474 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-23 19:27:52.574477 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.574481 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-23 19:27:52.574485 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.574488 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-23 19:27:52.574492 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.574496 | orchestrator | skipping: 
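The `[WARNING]: Skipped ... is not a directory` messages above come from the "Find prometheus host config overrides" task probing a per-host `prometheus.yml.d` overlay path on each testbed host; paths that do not exist as directories are skipped with a warning rather than failing the task. A minimal sketch of that skip behavior (not the actual Ansible `find` module code; the directory layout is copied from the log, the function and host list are illustrative):

```python
import os

def find_overrides(paths):
    """Return override files found under the given paths, warning
    (in the same spirit as the log above) about any path that is
    not a directory instead of raising an error."""
    found = []
    for path in paths:
        if not os.path.isdir(path):
            # Mirrors the skip-with-warning behavior seen in the log.
            print(f"[WARNING]: Skipped '{path}' path due to this "
                  f"access issue: '{path}' is not a directory")
            continue
        for name in sorted(os.listdir(path)):
            found.append(os.path.join(path, name))
    return found

# Overlay layout as it appears in the log; hosts shown are examples.
hosts = ["testbed-manager", "testbed-node-0"]
paths = [
    "/opt/configuration/environments/kolla/files/overlays/prometheus/"
    f"{h}/prometheus.yml.d"
    for h in hosts
]
find_overrides(paths)  # emits one warning per missing overlay directory
```

In the run above the warnings are benign: no host provides a `prometheus.yml.d` overlay directory, so the task simply reports `ok` with nothing found.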
[testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-23 19:27:52.574500 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.574519 | orchestrator | fatal: [testbed-manager]: FAILED! => {"msg": "{{ prometheus_blackbox_exporter_endpoints_default | selectattr('enabled', 'true') | map(attribute='endpoints') | flatten | union(prometheus_blackbox_exporter_endpoints_custom) | unique | select | list }}: [{'endpoints': ['aodh:os_endpoint:{{ aodh_public_endpoint }}', \"{{ ('aodh_internal:os_endpoint:' + aodh_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_aodh | bool }}'}, {'endpoints': ['barbican:os_endpoint:{{ barbican_public_endpoint }}', \"{{ ('barbican_internal:os_endpoint:' + barbican_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_barbican | bool }}'}, {'endpoints': ['blazar:os_endpoint:{{ blazar_public_base_endpoint }}', \"{{ ('blazar_internal:os_endpoint:' + blazar_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_blazar | bool }}'}, {'endpoints': ['ceph_rgw:http_2xx:{{ ceph_rgw_public_base_endpoint }}', \"{{ ('ceph_rgw_internal:http_2xx:' + ceph_rgw_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_ceph_rgw | bool }}'}, {'endpoints': ['cinder:os_endpoint:{{ cinder_public_base_endpoint }}', \"{{ ('cinder_internal:os_endpoint:' + cinder_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_cinder | bool }}'}, {'endpoints': ['cloudkitty:os_endpoint:{{ cloudkitty_public_endpoint }}', \"{{ ('cloudkitty_internal:os_endpoint:' + cloudkitty_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_cloudkitty | bool }}'}, {'endpoints': ['designate:os_endpoint:{{ designate_public_endpoint }}', \"{{ ('designate_internal:os_endpoint:' + 
designate_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_designate | bool }}'}, {'endpoints': ['glance:os_endpoint:{{ glance_public_endpoint }}', \"{{ ('glance_internal:os_endpoint:' + glance_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_glance | bool }}'}, {'endpoints': ['gnocchi:os_endpoint:{{ gnocchi_public_endpoint }}', \"{{ ('gnocchi_internal:os_endpoint:' + gnocchi_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_gnocchi | bool }}'}, {'endpoints': ['heat:os_endpoint:{{ heat_public_base_endpoint }}', \"{{ ('heat_internal:os_endpoint:' + heat_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\", 'heat_cfn:os_endpoint:{{ heat_cfn_public_base_endpoint }}', \"{{ ('heat_cfn_internal:os_endpoint:' + heat_cfn_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_heat | bool }}'}, {'endpoints': ['horizon:http_2xx:{{ horizon_public_endpoint }}', \"{{ ('horizon_internal:http_2xx:' + horizon_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_horizon | bool }}'}, {'endpoints': ['ironic:os_endpoint:{{ ironic_public_endpoint }}', \"{{ ('ironic_internal:os_endpoint:' + ironic_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\", 'ironic_inspector:os_endpoint:{{ ironic_inspector_public_endpoint }}', \"{{ ('ironic_inspector_internal:os_endpoint:' + ironic_inspector_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_ironic | bool }}'}, {'endpoints': ['keystone:os_endpoint:{{ keystone_public_url }}', \"{{ ('keystone_internal:os_endpoint:' + keystone_internal_url) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_keystone | bool }}'}, {'endpoints': ['magnum:os_endpoint:{{ magnum_public_base_endpoint }}', \"{{ ('magnum_internal:os_endpoint:' + 
magnum_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_magnum | bool }}'}, {'endpoints': ['manila:os_endpoint:{{ manila_public_base_endpoint }}', \"{{ ('manila_internal:os_endpoint:' + manila_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_manila | bool }}'}, {'endpoints': ['masakari:os_endpoint:{{ masakari_public_endpoint }}', \"{{ ('masakari_internal:os_endpoint:' + masakari_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_masakari | bool }}'}, {'endpoints': ['mistral:os_endpoint:{{ mistral_public_base_endpoint }}', \"{{ ('mistral_internal:os_endpoint:' + mistral_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_mistral | bool }}'}, {'endpoints': ['neutron:os_endpoint:{{ neutron_public_endpoint }}', \"{{ ('neutron_internal:os_endpoint:' + neutron_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_neutron | bool }}'}, {'endpoints': ['nova:os_endpoint:{{ nova_public_base_endpoint }}', \"{{ ('nova_internal:os_endpoint:' + nova_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_nova | bool }}'}, {'endpoints': ['octavia:os_endpoint:{{ octavia_public_endpoint }}', \"{{ ('octavia_internal:os_endpoint:' + octavia_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_octavia | bool }}'}, {'endpoints': ['placement:os_endpoint:{{ placement_public_endpoint }}', \"{{ ('placement_internal:os_endpoint:' + placement_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_placement | bool }}'}, {'endpoints': ['skyline_apiserver:os_endpoint:{{ skyline_apiserver_public_endpoint }}', \"{{ ('skyline_apiserver_internal:os_endpoint:' + skyline_apiserver_internal_endpoint) if not kolla_same_external_internal_vip | bool 
}}\", 'skyline_console:os_endpoint:{{ skyline_console_public_endpoint }}', \"{{ ('skyline_console_internal:os_endpoint:' + skyline_console_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_skyline | bool }}'}, {'endpoints': ['swift:os_endpoint:{{ swift_public_base_endpoint }}', \"{{ ('swift_internal:os_endpoint:' + swift_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_swift | bool }}'}, {'endpoints': ['tacker:os_endpoint:{{ tacker_public_endpoint }}', \"{{ ('tacker_internal:os_endpoint:' + tacker_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_tacker | bool }}'}, {'endpoints': ['trove:os_endpoint:{{ trove_public_base_endpoint }}', \"{{ ('trove_internal:os_endpoint:' + trove_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_trove | bool }}'}, {'endpoints': ['venus:os_endpoint:{{ venus_public_endpoint }}', \"{{ ('venus_internal:os_endpoint:' + venus_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_venus | bool }}'}, {'endpoints': ['watcher:os_endpoint:{{ watcher_public_endpoint }}', \"{{ ('watcher_internal:os_endpoint:' + watcher_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_watcher | bool }}'}, {'endpoints': ['zun:os_endpoint:{{ zun_public_base_endpoint }}', \"{{ ('zun_internal:os_endpoint:' + zun_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_zun | bool }}'}, {'endpoints': \"{% set etcd_endpoints = [] %}{% for host in groups.get('etcd', []) %}{{ etcd_endpoints.append('etcd_' + host + ':http_2xx:' + hostvars[host]['etcd_protocol'] + '://' + ('api' | kolla_address(host) | put_address_in_context('url')) + ':' + hostvars[host]['etcd_client_port'] + '/metrics')}}{% endfor %}{{ etcd_endpoints }}\", 'enabled': '{{ enable_etcd | bool 
}}'}, {'endpoints': ['grafana:http_2xx:{{ grafana_public_endpoint }}', \"{{ ('grafana_internal:http_2xx:' + grafana_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_grafana | bool }}'}, {'endpoints': ['opensearch:http_2xx:{{ opensearch_internal_endpoint }}'], 'enabled': '{{ enable_opensearch | bool }}'}, {'endpoints': ['opensearch_dashboards:http_2xx_opensearch_dashboards:{{ opensearch_dashboards_internal_endpoint }}/api/status'], 'enabled': '{{ enable_opensearch_dashboards | bool }}'}, {'endpoints': ['opensearch_dashboards_external:http_2xx_opensearch_dashboards:{{ opensearch_dashboards_external_endpoint }}/api/status'], 'enabled': '{{ enable_opensearch_dashboards_external | bool }}'}, {'endpoints': ['prometheus:http_2xx_prometheus:{{ prometheus_public_endpoint if enable_prometheus_server_external else prometheus_internal_endpoint }}/-/healthy'], 'enabled': '{{ enable_prometheus | bool }}'}, {'endpoints': ['prometheus_alertmanager:http_2xx_alertmanager:{{ prometheus_alertmanager_public_endpoint if enable_prometheus_alertmanager_external else prometheus_alertmanager_internal_endpoint }}'], 'enabled': '{{ enable_prometheus_alertmanager | bool }}'}, {'endpoints': \"{% set rabbitmq_endpoints = [] %}{% for host in groups.get('rabbitmq', []) %}{{ rabbitmq_endpoints.append('rabbitmq_' + host + (':tls_connect:' if rabbitmq_enable_tls | bool else ':tcp_connect:') + ('api' | kolla_address(host) | put_address_in_context('url')) + ':' + hostvars[host]['rabbitmq_port'] ) }}{% endfor %}{{ rabbitmq_endpoints }}\", 'enabled': '{{ enable_rabbitmq | bool }}'}, {'endpoints': \"{% set redis_endpoints = [] %}{% for host in groups.get('redis', []) %}{{ redis_endpoints.append('redis_' + host + ':tcp_connect:' + ('api' | kolla_address(host) | put_address_in_context('url')) + ':' + hostvars[host]['redis_port']) }}{% endfor %}{{ redis_endpoints }}\", 'enabled': '{{ enable_redis | bool }}'}]: 'swift_public_base_endpoint' is undefined"} 
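The fatal task above fails because the `prometheus_blackbox_exporter_endpoints_default` structure is templated as a whole before `selectattr('enabled', 'true')` can drop disabled services, so every endpoint string in the list gets rendered, and the swift entry references `swift_public_base_endpoint`, which is undefined in this deployment. A minimal stand-in for that failure mode (this is not kolla-ansible's code; `render`, `collect_endpoints`, and the two-entry `defaults` list are illustrative, with `str.format` standing in for Jinja2):

```python
class UndefinedError(Exception):
    """Stand-in for Jinja2's UndefinedError."""

def render(template, variables):
    # Substitute placeholders; a missing variable is fatal, just as
    # an undefined Jinja2 variable is in the log above.
    try:
        return template.format(**variables)
    except KeyError as exc:
        raise UndefinedError(f"{exc.args[0]!r} is undefined")

def collect_endpoints(defaults, variables):
    # The whole data structure is rendered while being flattened,
    # before any enabled/disabled filtering can take effect -- so a
    # disabled service's undefined endpoint variable still raises.
    return [render(ep, variables)
            for entry in defaults
            for ep in entry["endpoints"]]

defaults = [
    {"enabled_var": "enable_keystone",
     "endpoints": ["keystone:os_endpoint:{keystone_public_url}"]},
    {"enabled_var": "enable_swift",
     "endpoints": ["swift:os_endpoint:{swift_public_base_endpoint}"]},
]
variables = {"enable_keystone": True, "enable_swift": False,
             "keystone_public_url": "https://api.example:5000"}

try:
    collect_endpoints(defaults, variables)
except UndefinedError as exc:
    print(exc)  # 'swift_public_base_endpoint' is undefined
```

One plausible reading, consistent with the error text, is that swift is disabled in this testbed but the default endpoint list still references its endpoint variable; defining `swift_public_base_endpoint` (or overriding `prometheus_blackbox_exporter_endpoints_custom`) would be the usual ways around such a failure.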
2025-09-23 19:27:52.574531 | orchestrator | 2025-09-23 19:27:52.574535 | orchestrator | TASK [prometheus : Copying over prometheus web config file] ******************** 2025-09-23 19:27:52.574539 | orchestrator | Tuesday 23 September 2025 19:27:35 +0000 (0:00:08.495) 0:00:57.421 ***** 2025-09-23 19:27:52.574543 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-23 19:27:52.574546 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.574550 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-23 19:27:52.574554 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.574557 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-23 19:27:52.574561 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.574565 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-23 19:27:52.574568 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.574572 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-23 19:27:52.574576 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.574582 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-23 19:27:52.574585 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.574589 | orchestrator | 2025-09-23 19:27:52.574593 | orchestrator | TASK [prometheus : Copying over prometheus alertmanager config file] *********** 2025-09-23 19:27:52.574596 | orchestrator | Tuesday 23 September 2025 19:27:36 +0000 (0:00:01.095) 0:00:58.517 ***** 2025-09-23 19:27:52.574600 | orchestrator | skipping: [testbed-node-0] => 
(item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-23 19:27:52.574605 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.574612 | orchestrator | skipping: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-23 19:27:52.574618 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.574624 | orchestrator | skipping: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-23 19:27:52.574630 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.574638 | orchestrator | skipping: [testbed-node-3] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-23 19:27:52.574645 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.574651 | orchestrator | skipping: [testbed-node-4] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-23 19:27:52.574657 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.574662 | orchestrator | skipping: [testbed-node-5] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-23 19:27:52.574668 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.574674 | orchestrator | 2025-09-23 19:27:52.574680 | orchestrator | TASK [prometheus : Find custom Alertmanager alert notification templates] ****** 2025-09-23 19:27:52.574686 | orchestrator | Tuesday 23 September 2025 19:27:37 +0000 (0:00:00.885) 0:00:59.402 ***** 2025-09-23 19:27:52.574692 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-23 19:27:52.574698 | orchestrator | 2025-09-23 19:27:52.574704 | orchestrator | TASK [prometheus : Copying over custom Alertmanager alert notification templates] *** 2025-09-23 19:27:52.574710 | orchestrator | Tuesday 23 September 
2025 19:27:37 +0000 (0:00:00.595) 0:00:59.998 ***** 2025-09-23 19:27:52.574716 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.574722 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.574728 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.574734 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.574740 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.574746 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.574751 | orchestrator | 2025-09-23 19:27:52.574761 | orchestrator | TASK [prometheus : Copying over my.cnf for mysqld_exporter] ******************** 2025-09-23 19:27:52.574767 | orchestrator | Tuesday 23 September 2025 19:27:38 +0000 (0:00:00.613) 0:01:00.612 ***** 2025-09-23 19:27:52.574773 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.574779 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.574785 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.574791 | orchestrator | changed: [testbed-node-0] 2025-09-23 19:27:52.574796 | orchestrator | changed: [testbed-node-1] 2025-09-23 19:27:52.574802 | orchestrator | changed: [testbed-node-2] 2025-09-23 19:27:52.574808 | orchestrator | 2025-09-23 19:27:52.574814 | orchestrator | TASK [prometheus : Copying cloud config file for openstack exporter] *********** 2025-09-23 19:27:52.574820 | orchestrator | Tuesday 23 September 2025 19:27:40 +0000 (0:00:01.452) 0:01:02.064 ***** 2025-09-23 19:27:52.574826 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-23 19:27:52.574832 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.574838 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-23 19:27:52.574844 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.574853 | orchestrator | skipping: [testbed-node-2] => 
(item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-23 19:27:52.574859 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.574865 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-23 19:27:52.574871 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.574877 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-23 19:27:52.574883 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.574888 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-23 19:27:52.574894 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.574900 | orchestrator | 2025-09-23 19:27:52.574906 | orchestrator | TASK [prometheus : Copying config file for blackbox exporter] ****************** 2025-09-23 19:27:52.574912 | orchestrator | Tuesday 23 September 2025 19:27:41 +0000 (0:00:01.225) 0:01:03.290 ***** 2025-09-23 19:27:52.574918 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-23 19:27:52.574924 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.574941 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-23 19:27:52.574947 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.574953 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-23 19:27:52.574959 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.574965 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-23 19:27:52.574971 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.574977 | orchestrator | skipping: [testbed-node-3] => 
(item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-23 19:27:52.574983 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.574989 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-23 19:27:52.574994 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.575000 | orchestrator | 2025-09-23 19:27:52.575006 | orchestrator | TASK [prometheus : Find extra prometheus server config files] ****************** 2025-09-23 19:27:52.575012 | orchestrator | Tuesday 23 September 2025 19:27:42 +0000 (0:00:01.235) 0:01:04.526 ***** 2025-09-23 19:27:52.575019 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.575024 | orchestrator | 2025-09-23 19:27:52.575030 | orchestrator | TASK [prometheus : Create subdirectories for extra config files] *************** 2025-09-23 19:27:52.575036 | orchestrator | Tuesday 23 September 2025 19:27:43 +0000 (0:00:00.835) 0:01:05.361 ***** 2025-09-23 19:27:52.575042 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.575048 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.575054 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.575062 | orchestrator | skipping: [testbed-node-3] 2025-09-23 19:27:52.575068 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.575074 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.575079 | orchestrator | 2025-09-23 19:27:52.575085 | orchestrator | TASK [prometheus : Template extra prometheus server config files] ************** 2025-09-23 19:27:52.575091 | orchestrator | Tuesday 23 September 2025 19:27:43 +0000 (0:00:00.575) 0:01:05.936 ***** 2025-09-23 19:27:52.575097 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:27:52.575103 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:27:52.575109 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:27:52.575115 | orchestrator | skipping: 
[testbed-node-3] 2025-09-23 19:27:52.575121 | orchestrator | skipping: [testbed-node-4] 2025-09-23 19:27:52.575126 | orchestrator | skipping: [testbed-node-5] 2025-09-23 19:27:52.575132 | orchestrator | 2025-09-23 19:27:52.575138 | orchestrator | TASK [prometheus : Check prometheus containers] ******************************** 2025-09-23 19:27:52.575148 | orchestrator | Tuesday 23 September 2025 19:27:44 +0000 (0:00:00.723) 0:01:06.660 ***** 2025-09-23 19:27:52.575157 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.575164 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.575170 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.575177 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.575183 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.575189 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-23 19:27:52.575198 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 
'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.575204 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.575216 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.575225 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.575232 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 
'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.575238 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.575244 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.575251 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.575259 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-23 19:27:52.575269 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.575275 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.575284 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': 
['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.575291 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.575298 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-23 19:27:52.575304 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 
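The loop items above are kolla-style container definitions: a dict per service carrying `container_name`, `image`, bind-mount `volumes`, and optional settings such as `pid_mode` (the node exporter runs in the host PID namespace). As a rough illustration of how such a definition maps onto container runtime flags, here is a minimal sketch — the function names are hypothetical and not part of kolla-ansible, and the real deployment uses the `kolla_docker` module rather than the docker CLI:

```python
# Sketch: translate a kolla-style container definition (as seen in the
# log items above) into approximate "docker run" arguments.
# Illustrative only; kolla-ansible itself drives the container engine
# through its own Ansible module, not by shelling out like this.

def volume_args(definition: dict) -> list:
    """Render the 'volumes' list as repeated -v flags."""
    args = []
    for spec in definition.get("volumes", []):
        args += ["-v", spec]
    return args

def run_command(definition: dict) -> list:
    """Build an approximate 'docker run' command line."""
    cmd = ["docker", "run", "-d", "--name", definition["container_name"]]
    if definition.get("pid_mode") == "host":
        # prometheus-node-exporter shares the host PID namespace
        cmd += ["--pid", "host"]
    cmd += volume_args(definition)
    cmd.append(definition["image"])
    return cmd

# Example definition modeled on the log output above (trimmed).
node_exporter = {
    "container_name": "prometheus_node_exporter",
    "enabled": True,
    "image": "registry.osism.tech/kolla/prometheus-node-exporter:2024.2",
    "pid_mode": "host",
    "volumes": [
        "/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro",
        "/:/host:ro,rslave",
    ],
}

print(" ".join(run_command(node_exporter)))
```

The `/:/host:ro,rslave` mount is what lets the node exporter read host metrics from inside the container.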
2025-09-23 19:27:52.575310 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.575318 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.575329 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-23 19:27:52.575335 | orchestrator |
2025-09-23 19:27:52.575341 | orchestrator | TASK [prometheus : Creating prometheus database user and setting permissions] ***
2025-09-23 19:27:52.575347 | orchestrator | Tuesday 23 September 2025 19:27:48 +0000 (0:00:03.634) 0:01:10.294 *****
2025-09-23 19:27:52.575354 | orchestrator | failed: [testbed-node-0] (item=testbed-node-0) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "0", "value": {"hosts": ["testbed-node-0", "testbed-node-1", "testbed-node-2"]}}, "msg": "kolla_toolbox container is not running."}
2025-09-23 19:27:52.575360 | orchestrator |
2025-09-23 19:27:52.575369 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:27:52.575376 | orchestrator | testbed-manager : ok=11  changed=4  unreachable=0 failed=1  skipped=2  rescued=0 ignored=0
2025-09-23 19:27:52.575382 | orchestrator | testbed-node-0 : ok=11  changed=5  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0
2025-09-23 19:27:52.575388 | orchestrator | testbed-node-1 : ok=10  changed=5  unreachable=0 failed=0 skipped=11  rescued=0 ignored=0
2025-09-23 19:27:52.575394 | orchestrator | testbed-node-2 : ok=10  changed=5  unreachable=0 failed=0 skipped=11  rescued=0 ignored=0
2025-09-23 19:27:52.575400 | orchestrator | testbed-node-3 : ok=9  changed=4  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0
2025-09-23 19:27:52.575406 | orchestrator | testbed-node-4 : ok=9  changed=4  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0
2025-09-23 19:27:52.575412 | orchestrator | testbed-node-5 : ok=9  changed=4  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0
2025-09-23 19:27:52.575418 | orchestrator |
2025-09-23 19:27:52.575425 | orchestrator |
2025-09-23 19:27:52.575431 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:27:52.575437 | orchestrator | Tuesday 23 September 2025 19:27:49 +0000 (0:00:01.530) 0:01:11.825 *****
2025-09-23 19:27:52.575443 | orchestrator | ===============================================================================
2025-09-23 19:27:52.575449 | orchestrator | prometheus : Copying over custom prometheus alert rules files ---------- 22.87s
2025-09-23 19:27:52.575454 | orchestrator | prometheus : Copying over prometheus config file
------------------------ 8.50s 2025-09-23 19:27:52.575461 | orchestrator | prometheus : Copying over config.json files ----------------------------- 6.30s 2025-09-23 19:27:52.575467 | orchestrator | service-cert-copy : prometheus | Copying over extra CA certificates ----- 5.87s 2025-09-23 19:27:52.575473 | orchestrator | prometheus : Check prometheus containers -------------------------------- 3.63s 2025-09-23 19:27:52.575479 | orchestrator | prometheus : Ensuring config directories exist -------------------------- 3.10s 2025-09-23 19:27:52.575484 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS key --- 1.90s 2025-09-23 19:27:52.575490 | orchestrator | prometheus : Find prometheus host config overrides ---------------------- 1.63s 2025-09-23 19:27:52.575499 | orchestrator | prometheus : Creating prometheus database user and setting permissions --- 1.53s 2025-09-23 19:27:52.575505 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS certificate --- 1.51s 2025-09-23 19:27:52.575511 | orchestrator | prometheus : Copying over my.cnf for mysqld_exporter -------------------- 1.45s 2025-09-23 19:27:52.575517 | orchestrator | prometheus : include_tasks ---------------------------------------------- 1.30s 2025-09-23 19:27:52.575523 | orchestrator | prometheus : Copying config file for blackbox exporter ------------------ 1.24s 2025-09-23 19:27:52.575529 | orchestrator | prometheus : Copying cloud config file for openstack exporter ----------- 1.23s 2025-09-23 19:27:52.575535 | orchestrator | prometheus : include_tasks ---------------------------------------------- 1.21s 2025-09-23 19:27:52.575541 | orchestrator | prometheus : Copying over prometheus web config file -------------------- 1.10s 2025-09-23 19:27:52.575547 | orchestrator | prometheus : Copying over prometheus alertmanager config file ----------- 0.89s 2025-09-23 19:27:52.575553 | orchestrator | prometheus : Find custom prometheus alert rules files 
------------------- 0.86s 2025-09-23 19:27:52.575561 | orchestrator | prometheus : Find extra prometheus server config files ------------------ 0.84s 2025-09-23 19:27:52.575567 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.74s 2025-09-23 19:27:52.575573 | orchestrator | 2025-09-23 19:27:52 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:27:52.575579 | orchestrator | 2025-09-23 19:27:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:27:55.616397 | orchestrator | 2025-09-23 19:27:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:27:55.616993 | orchestrator | 2025-09-23 19:27:55 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED 2025-09-23 19:27:55.617873 | orchestrator | 2025-09-23 19:27:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:27:55.618620 | orchestrator | 2025-09-23 19:27:55 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:27:55.618635 | orchestrator | 2025-09-23 19:27:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:27:58.661891 | orchestrator | 2025-09-23 19:27:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:27:58.663468 | orchestrator | 2025-09-23 19:27:58 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED 2025-09-23 19:27:58.665402 | orchestrator | 2025-09-23 19:27:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:27:58.666776 | orchestrator | 2025-09-23 19:27:58 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:27:58.667002 | orchestrator | 2025-09-23 19:27:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:01.700816 | orchestrator | 2025-09-23 19:28:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:01.703957 | 
orchestrator | 2025-09-23 19:28:01 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED 2025-09-23 19:28:01.706473 | orchestrator | 2025-09-23 19:28:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:01.709133 | orchestrator | 2025-09-23 19:28:01 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:01.709142 | orchestrator | 2025-09-23 19:28:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:04.749389 | orchestrator | 2025-09-23 19:28:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:04.751418 | orchestrator | 2025-09-23 19:28:04 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state STARTED 2025-09-23 19:28:04.753676 | orchestrator | 2025-09-23 19:28:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:04.755515 | orchestrator | 2025-09-23 19:28:04 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:04.755789 | orchestrator | 2025-09-23 19:28:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:07.795369 | orchestrator | 2025-09-23 19:28:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:07.796421 | orchestrator | 2025-09-23 19:28:07 | INFO  | Task 7e6293bb-12e9-4395-bcdf-b4238791a6c4 is in state SUCCESS 2025-09-23 19:28:07.798474 | orchestrator | 2025-09-23 19:28:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:07.800220 | orchestrator | 2025-09-23 19:28:07 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:07.800260 | orchestrator | 2025-09-23 19:28:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:10.835307 | orchestrator | 2025-09-23 19:28:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:10.839700 | orchestrator | 2025-09-23 
19:28:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:10.841800 | orchestrator | 2025-09-23 19:28:10 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:10.843695 | orchestrator | 2025-09-23 19:28:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:13.884673 | orchestrator | 2025-09-23 19:28:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:13.885073 | orchestrator | 2025-09-23 19:28:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:13.886982 | orchestrator | 2025-09-23 19:28:13 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:13.887065 | orchestrator | 2025-09-23 19:28:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:16.924255 | orchestrator | 2025-09-23 19:28:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:16.925840 | orchestrator | 2025-09-23 19:28:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:16.927376 | orchestrator | 2025-09-23 19:28:16 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:16.927412 | orchestrator | 2025-09-23 19:28:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:19.961305 | orchestrator | 2025-09-23 19:28:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:19.962429 | orchestrator | 2025-09-23 19:28:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:19.963844 | orchestrator | 2025-09-23 19:28:19 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:19.963870 | orchestrator | 2025-09-23 19:28:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:22.995703 | orchestrator | 2025-09-23 19:28:22 | INFO  | Task 
afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:22.997305 | orchestrator | 2025-09-23 19:28:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:22.998807 | orchestrator | 2025-09-23 19:28:22 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:22.998989 | orchestrator | 2025-09-23 19:28:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:26.039589 | orchestrator | 2025-09-23 19:28:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:26.044103 | orchestrator | 2025-09-23 19:28:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:26.047047 | orchestrator | 2025-09-23 19:28:26 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:26.047156 | orchestrator | 2025-09-23 19:28:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:29.087572 | orchestrator | 2025-09-23 19:28:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:29.090409 | orchestrator | 2025-09-23 19:28:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:29.092107 | orchestrator | 2025-09-23 19:28:29 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:29.092229 | orchestrator | 2025-09-23 19:28:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:32.132643 | orchestrator | 2025-09-23 19:28:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:32.135032 | orchestrator | 2025-09-23 19:28:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:32.137600 | orchestrator | 2025-09-23 19:28:32 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:32.137630 | orchestrator | 2025-09-23 19:28:32 | INFO  | Wait 1 second(s) until the next 
check 2025-09-23 19:28:35.175279 | orchestrator | 2025-09-23 19:28:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:35.176829 | orchestrator | 2025-09-23 19:28:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:35.178209 | orchestrator | 2025-09-23 19:28:35 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:35.178247 | orchestrator | 2025-09-23 19:28:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:38.211059 | orchestrator | 2025-09-23 19:28:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:38.213861 | orchestrator | 2025-09-23 19:28:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:38.216296 | orchestrator | 2025-09-23 19:28:38 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:38.216323 | orchestrator | 2025-09-23 19:28:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:41.261383 | orchestrator | 2025-09-23 19:28:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:41.263690 | orchestrator | 2025-09-23 19:28:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:41.265830 | orchestrator | 2025-09-23 19:28:41 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state STARTED 2025-09-23 19:28:41.266102 | orchestrator | 2025-09-23 19:28:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:44.308953 | orchestrator | 2025-09-23 19:28:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:44.310475 | orchestrator | 2025-09-23 19:28:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:44.314467 | orchestrator | 2025-09-23 19:28:44 | INFO  | Task 52510651-5afa-4dce-8d76-86ebe4293539 is in state SUCCESS 2025-09-23 
19:28:44.314850 | orchestrator | 2025-09-23 19:28:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:28:44.316318 | orchestrator | 2025-09-23 19:28:44.316350 | orchestrator | 2025-09-23 19:28:44.316362 | orchestrator | PLAY [Bootstraph ceph dashboard] *********************************************** 2025-09-23 19:28:44.316395 | orchestrator | 2025-09-23 19:28:44.316407 | orchestrator | TASK [Disable the ceph dashboard] ********************************************** 2025-09-23 19:28:44.316418 | orchestrator | Tuesday 23 September 2025 19:26:38 +0000 (0:00:00.242) 0:00:00.242 ***** 2025-09-23 19:28:44.316429 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.316441 | orchestrator | 2025-09-23 19:28:44.316452 | orchestrator | TASK [Set mgr/dashboard/ssl to false] ****************************************** 2025-09-23 19:28:44.316462 | orchestrator | Tuesday 23 September 2025 19:26:40 +0000 (0:00:01.987) 0:00:02.229 ***** 2025-09-23 19:28:44.316473 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.316484 | orchestrator | 2025-09-23 19:28:44.316494 | orchestrator | TASK [Set mgr/dashboard/server_port to 7000] *********************************** 2025-09-23 19:28:44.316505 | orchestrator | Tuesday 23 September 2025 19:26:41 +0000 (0:00:00.972) 0:00:03.201 ***** 2025-09-23 19:28:44.316516 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.316526 | orchestrator | 2025-09-23 19:28:44.316602 | orchestrator | TASK [Set mgr/dashboard/server_addr to 0.0.0.0] ******************************** 2025-09-23 19:28:44.316679 | orchestrator | Tuesday 23 September 2025 19:26:42 +0000 (0:00:01.122) 0:00:04.324 ***** 2025-09-23 19:28:44.316694 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.316705 | orchestrator | 2025-09-23 19:28:44.316716 | orchestrator | TASK [Set mgr/dashboard/standby_behaviour to error] **************************** 2025-09-23 19:28:44.316727 | orchestrator | Tuesday 23 September 2025 19:26:43 +0000 
(0:00:01.042) 0:00:05.366 ***** 2025-09-23 19:28:44.316737 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.316815 | orchestrator | 2025-09-23 19:28:44.316827 | orchestrator | TASK [Set mgr/dashboard/standby_error_status_code to 404] ********************** 2025-09-23 19:28:44.316837 | orchestrator | Tuesday 23 September 2025 19:26:44 +0000 (0:00:00.944) 0:00:06.311 ***** 2025-09-23 19:28:44.316848 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.316859 | orchestrator | 2025-09-23 19:28:44.316870 | orchestrator | TASK [Enable the ceph dashboard] *********************************************** 2025-09-23 19:28:44.316880 | orchestrator | Tuesday 23 September 2025 19:26:45 +0000 (0:00:00.990) 0:00:07.301 ***** 2025-09-23 19:28:44.316891 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.316960 | orchestrator | 2025-09-23 19:28:44.316973 | orchestrator | TASK [Write ceph_dashboard_password to temporary file] ************************* 2025-09-23 19:28:44.318186 | orchestrator | Tuesday 23 September 2025 19:26:46 +0000 (0:00:01.452) 0:00:08.753 ***** 2025-09-23 19:28:44.318267 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.318282 | orchestrator | 2025-09-23 19:28:44.318293 | orchestrator | TASK [Create admin user] ******************************************************* 2025-09-23 19:28:44.318304 | orchestrator | Tuesday 23 September 2025 19:26:48 +0000 (0:00:01.266) 0:00:10.020 ***** 2025-09-23 19:28:44.318314 | orchestrator | changed: [testbed-manager] 2025-09-23 19:28:44.318324 | orchestrator | 2025-09-23 19:28:44.318333 | orchestrator | TASK [Remove temporary file for ceph_dashboard_password] *********************** 2025-09-23 19:28:44.318343 | orchestrator | Tuesday 23 September 2025 19:27:40 +0000 (0:00:52.512) 0:01:02.533 ***** 2025-09-23 19:28:44.318353 | orchestrator | skipping: [testbed-manager] 2025-09-23 19:28:44.318363 | orchestrator | 2025-09-23 19:28:44.318372 | orchestrator | PLAY [Restart ceph 
manager services] *******************************************
2025-09-23 19:28:44.318382 | orchestrator |
2025-09-23 19:28:44.318391 | orchestrator | TASK [Restart ceph manager service] ********************************************
2025-09-23 19:28:44.318401 | orchestrator | Tuesday 23 September 2025 19:27:40 +0000 (0:00:00.162) 0:01:02.695 *****
2025-09-23 19:28:44.318410 | orchestrator | changed: [testbed-node-0]
2025-09-23 19:28:44.318420 | orchestrator |
2025-09-23 19:28:44.318430 | orchestrator | PLAY [Restart ceph manager services] *******************************************
2025-09-23 19:28:44.318439 | orchestrator |
2025-09-23 19:28:44.318449 | orchestrator | TASK [Restart ceph manager service] ********************************************
2025-09-23 19:28:44.318458 | orchestrator | Tuesday 23 September 2025 19:27:52 +0000 (0:00:11.625) 0:01:14.321 *****
2025-09-23 19:28:44.318468 | orchestrator | changed: [testbed-node-1]
2025-09-23 19:28:44.318502 | orchestrator |
2025-09-23 19:28:44.318512 | orchestrator | PLAY [Restart ceph manager services] *******************************************
2025-09-23 19:28:44.318522 | orchestrator |
2025-09-23 19:28:44.318532 | orchestrator | TASK [Restart ceph manager service] ********************************************
2025-09-23 19:28:44.318541 | orchestrator | Tuesday 23 September 2025 19:27:53 +0000 (0:00:01.363) 0:01:15.685 *****
2025-09-23 19:28:44.318551 | orchestrator | changed: [testbed-node-2]
2025-09-23 19:28:44.318560 | orchestrator |
2025-09-23 19:28:44.318570 | orchestrator | PLAY RECAP *********************************************************************
2025-09-23 19:28:44.318581 | orchestrator | testbed-manager : ok=9  changed=9  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0
2025-09-23 19:28:44.318591 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:28:44.318601 | orchestrator | testbed-node-1 : ok=1  changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:28:44.318621 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
2025-09-23 19:28:44.318631 | orchestrator |
2025-09-23 19:28:44.318641 | orchestrator |
2025-09-23 19:28:44.318651 | orchestrator |
2025-09-23 19:28:44.318660 | orchestrator | TASKS RECAP ********************************************************************
2025-09-23 19:28:44.318670 | orchestrator | Tuesday 23 September 2025 19:28:04 +0000 (0:00:11.181) 0:01:26.866 *****
2025-09-23 19:28:44.318680 | orchestrator | ===============================================================================
2025-09-23 19:28:44.318689 | orchestrator | Create admin user ------------------------------------------------------ 52.51s
2025-09-23 19:28:44.318699 | orchestrator | Restart ceph manager service ------------------------------------------- 24.17s
2025-09-23 19:28:44.318787 | orchestrator | Disable the ceph dashboard ---------------------------------------------- 1.99s
2025-09-23 19:28:44.318799 | orchestrator | Enable the ceph dashboard ----------------------------------------------- 1.45s
2025-09-23 19:28:44.318809 | orchestrator | Write ceph_dashboard_password to temporary file ------------------------- 1.27s
2025-09-23 19:28:44.318819 | orchestrator | Set mgr/dashboard/server_port to 7000 ----------------------------------- 1.12s
2025-09-23 19:28:44.318828 | orchestrator | Set mgr/dashboard/server_addr to 0.0.0.0 -------------------------------- 1.04s
2025-09-23 19:28:44.318838 | orchestrator | Set mgr/dashboard/standby_error_status_code to 404 ---------------------- 0.99s
2025-09-23 19:28:44.318847 | orchestrator | Set mgr/dashboard/ssl to false ------------------------------------------ 0.97s
2025-09-23 19:28:44.318857 | orchestrator | Set mgr/dashboard/standby_behaviour to error ---------------------------- 0.94s
2025-09-23 19:28:44.318866 | orchestrator | Remove temporary file for
ceph_dashboard_password ----------------------- 0.16s 2025-09-23 19:28:44.318876 | orchestrator | 2025-09-23 19:28:44.318885 | orchestrator | 2025-09-23 19:28:44.318925 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-23 19:28:44.318936 | orchestrator | 2025-09-23 19:28:44.318946 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-23 19:28:44.318955 | orchestrator | Tuesday 23 September 2025 19:27:53 +0000 (0:00:00.235) 0:00:00.235 ***** 2025-09-23 19:28:44.318965 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:28:44.318975 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:28:44.318984 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:28:44.318994 | orchestrator | 2025-09-23 19:28:44.319003 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-23 19:28:44.319013 | orchestrator | Tuesday 23 September 2025 19:27:53 +0000 (0:00:00.265) 0:00:00.501 ***** 2025-09-23 19:28:44.319022 | orchestrator | ok: [testbed-node-0] => (item=enable_grafana_True) 2025-09-23 19:28:44.319032 | orchestrator | ok: [testbed-node-1] => (item=enable_grafana_True) 2025-09-23 19:28:44.319042 | orchestrator | ok: [testbed-node-2] => (item=enable_grafana_True) 2025-09-23 19:28:44.319059 | orchestrator | 2025-09-23 19:28:44.319085 | orchestrator | PLAY [Apply role grafana] ****************************************************** 2025-09-23 19:28:44.319109 | orchestrator | 2025-09-23 19:28:44.319126 | orchestrator | TASK [grafana : include_tasks] ************************************************* 2025-09-23 19:28:44.319141 | orchestrator | Tuesday 23 September 2025 19:27:54 +0000 (0:00:00.341) 0:00:00.842 ***** 2025-09-23 19:28:44.319157 | orchestrator | included: /ansible/roles/grafana/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:28:44.319173 | orchestrator | 2025-09-23 19:28:44.319190 | 
orchestrator | TASK [grafana : Ensuring config directories exist] ***************************** 2025-09-23 19:28:44.319207 | orchestrator | Tuesday 23 September 2025 19:27:54 +0000 (0:00:00.466) 0:00:01.308 ***** 2025-09-23 19:28:44.319226 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319247 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319265 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319275 | orchestrator | 2025-09-23 19:28:44.319285 | orchestrator | TASK [grafana : Check if extra configuration file exists] ********************** 2025-09-23 19:28:44.319295 | orchestrator | Tuesday 23 September 2025 19:27:55 +0000 (0:00:00.719) 0:00:02.028 ***** 2025-09-23 19:28:44.319345 | orchestrator | [WARNING]: Skipped '/operations/prometheus/grafana' path due to this access 2025-09-23 19:28:44.319356 | orchestrator | issue: '/operations/prometheus/grafana' is not a directory 2025-09-23 19:28:44.319366 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-23 19:28:44.319375 | orchestrator | 2025-09-23 19:28:44.319385 | orchestrator | TASK [grafana : include_tasks] ************************************************* 2025-09-23 19:28:44.319394 | orchestrator | Tuesday 23 September 2025 19:27:56 +0000 (0:00:00.737) 0:00:02.765 ***** 2025-09-23 19:28:44.319404 | orchestrator | included: /ansible/roles/grafana/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-23 19:28:44.319414 | orchestrator | 2025-09-23 19:28:44.319423 | orchestrator | TASK [service-cert-copy : grafana | Copying over extra CA certificates] ******** 2025-09-23 19:28:44.319433 | orchestrator | Tuesday 23 September 2025 19:27:56 +0000 (0:00:00.561) 0:00:03.326 ***** 2025-09-23 19:28:44.319443 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319462 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319473 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319483 | orchestrator | 2025-09-23 19:28:44.319492 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS certificate] *** 2025-09-23 19:28:44.319502 | orchestrator | Tuesday 
23 September 2025 19:27:57 +0000 (0:00:01.268) 0:00:04.595 ***** 2025-09-23 19:28:44.319512 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-23 19:28:44.319522 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:28:44.319562 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-23 19:28:44.319574 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:28:44.319584 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-23 19:28:44.319599 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:28:44.319609 | orchestrator | 2025-09-23 19:28:44.319619 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS key] ***** 2025-09-23 19:28:44.319629 | orchestrator | Tuesday 23 September 2025 19:27:58 +0000 (0:00:00.311) 0:00:04.906 ***** 2025-09-23 19:28:44.319639 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-23 19:28:44.319649 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:28:44.319659 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': 
'3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-23 19:28:44.319669 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:28:44.319679 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-23 19:28:44.319689 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:28:44.319698 | orchestrator | 2025-09-23 19:28:44.319708 | orchestrator | TASK [grafana : Copying over config.json files] ******************************** 2025-09-23 19:28:44.319718 | orchestrator | Tuesday 23 September 2025 19:27:59 +0000 (0:00:00.753) 0:00:05.660 ***** 2025-09-23 19:28:44.319731 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': 
'3000'}}}}) 2025-09-23 19:28:44.319766 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319783 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319794 | orchestrator | 2025-09-23 19:28:44.319803 | orchestrator | TASK [grafana : Copying over grafana.ini] ************************************** 2025-09-23 19:28:44.319813 | orchestrator | Tuesday 23 September 2025 19:28:00 +0000 (0:00:01.194) 0:00:06.855 ***** 2025-09-23 19:28:44.319823 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319833 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319844 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.319854 | orchestrator | 2025-09-23 19:28:44.319864 | orchestrator | TASK [grafana : Copying over extra configuration file] ************************* 2025-09-23 
19:28:44.319873 | orchestrator | Tuesday 23 September 2025 19:28:01 +0000 (0:00:01.266) 0:00:08.121 ***** 2025-09-23 19:28:44.319883 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:28:44.319892 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:28:44.319924 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:28:44.319933 | orchestrator | 2025-09-23 19:28:44.319953 | orchestrator | TASK [grafana : Configuring Prometheus as data source for Grafana] ************* 2025-09-23 19:28:44.319963 | orchestrator | Tuesday 23 September 2025 19:28:01 +0000 (0:00:00.364) 0:00:08.486 ***** 2025-09-23 19:28:44.319972 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2) 2025-09-23 19:28:44.319982 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2) 2025-09-23 19:28:44.319991 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2) 2025-09-23 19:28:44.320001 | orchestrator | 2025-09-23 19:28:44.320034 | orchestrator | TASK [grafana : Configuring dashboards provisioning] *************************** 2025-09-23 19:28:44.320045 | orchestrator | Tuesday 23 September 2025 19:28:03 +0000 (0:00:01.221) 0:00:09.707 ***** 2025-09-23 19:28:44.320055 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml) 2025-09-23 19:28:44.320065 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml) 2025-09-23 19:28:44.320075 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml) 2025-09-23 19:28:44.320084 | orchestrator | 2025-09-23 19:28:44.320094 | orchestrator | TASK [grafana : Find custom grafana dashboards] ******************************** 2025-09-23 19:28:44.320103 | orchestrator | Tuesday 23 September 
2025 19:28:04 +0000 (0:00:01.224) 0:00:10.932 ***** 2025-09-23 19:28:44.320113 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-23 19:28:44.320122 | orchestrator | 2025-09-23 19:28:44.320132 | orchestrator | TASK [grafana : Find templated grafana dashboards] ***************************** 2025-09-23 19:28:44.320141 | orchestrator | Tuesday 23 September 2025 19:28:05 +0000 (0:00:00.684) 0:00:11.617 ***** 2025-09-23 19:28:44.320151 | orchestrator | [WARNING]: Skipped '/etc/kolla/grafana/dashboards' path due to this access 2025-09-23 19:28:44.320161 | orchestrator | issue: '/etc/kolla/grafana/dashboards' is not a directory 2025-09-23 19:28:44.320170 | orchestrator | ok: [testbed-node-0] 2025-09-23 19:28:44.320180 | orchestrator | ok: [testbed-node-1] 2025-09-23 19:28:44.320189 | orchestrator | ok: [testbed-node-2] 2025-09-23 19:28:44.320198 | orchestrator | 2025-09-23 19:28:44.320208 | orchestrator | TASK [grafana : Prune templated Grafana dashboards] **************************** 2025-09-23 19:28:44.320218 | orchestrator | Tuesday 23 September 2025 19:28:05 +0000 (0:00:00.646) 0:00:12.263 ***** 2025-09-23 19:28:44.320227 | orchestrator | skipping: [testbed-node-0] 2025-09-23 19:28:44.320237 | orchestrator | skipping: [testbed-node-1] 2025-09-23 19:28:44.320246 | orchestrator | skipping: [testbed-node-2] 2025-09-23 19:28:44.320256 | orchestrator | 2025-09-23 19:28:44.320265 | orchestrator | TASK [grafana : Copying over custom dashboards] ******************************** 2025-09-23 19:28:44.320275 | orchestrator | Tuesday 23 September 2025 19:28:06 +0000 (0:00:00.427) 0:00:12.691 ***** 2025-09-23 19:28:44.320285 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/ceph-cluster-advanced.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster-advanced.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 117836, 'inode': 
1080119, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.444792, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320297 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/ceph-cluster-advanced.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster-advanced.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 117836, 'inode': 1080119, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.444792, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320313 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/ceph-cluster-advanced.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster-advanced.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 117836, 'inode': 1080119, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.444792, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320351 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/rbd-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': 
False, 'uid': 0, 'gid': 0, 'size': 25686, 'inode': 1080192, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4661233, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320363 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/rbd-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25686, 'inode': 1080192, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4661233, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320373 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/rbd-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25686, 'inode': 1080192, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4661233, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320383 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/ceph_pools.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_pools.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': 
False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25279, 'inode': 1080137, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4515421, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320393 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/ceph_pools.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_pools.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25279, 'inode': 1080137, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4515421, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320409 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/ceph_pools.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_pools.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25279, 'inode': 1080137, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4515421, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320427 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/rgw-s3-analytics.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rgw-s3-analytics.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': 
False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 167897, 'inode': 1080196, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4679828, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320462 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/rgw-s3-analytics.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rgw-s3-analytics.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 167897, 'inode': 1080196, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4679828, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320473 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/rgw-s3-analytics.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rgw-s3-analytics.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 167897, 'inode': 1080196, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4679828, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320483 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/osd-device-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osd-device-details.json', 'mode': '0644', 'isdir': False, 'ischr': 
False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 26655, 'inode': 1080153, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4540577, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320493 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/osd-device-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osd-device-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 26655, 'inode': 1080153, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4540577, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320509 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/osd-device-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osd-device-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 26655, 'inode': 1080153, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4540577, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320523 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/radosgw-overview.json', 'value': {'path': 
'/operations/grafana/dashboards/ceph/radosgw-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 39556, 'inode': 1080173, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4633667, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320558 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/radosgw-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 39556, 'inode': 1080173, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4633667, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320569 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/radosgw-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 39556, 'inode': 1080173, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4633667, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320579 | orchestrator | changed: [testbed-node-1] => (item={'key': 
'ceph/README.md', 'value': {'path': '/operations/grafana/dashboards/ceph/README.md', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 84, 'inode': 1080117, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4442987, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320589 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/README.md', 'value': {'path': '/operations/grafana/dashboards/ceph/README.md', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 84, 'inode': 1080117, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4442987, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320605 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/README.md', 'value': {'path': '/operations/grafana/dashboards/ceph/README.md', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 84, 'inode': 1080117, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4442987, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320615 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/ceph-cluster.json', 'value': 
{'path': '/operations/grafana/dashboards/ceph/ceph-cluster.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 34113, 'inode': 1080126, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.447945, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320630 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/ceph-cluster.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 34113, 'inode': 1080126, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.447945, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320665 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/ceph-cluster.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 34113, 'inode': 1080126, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.447945, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320676 | orchestrator | changed: [testbed-node-1] => (item={'key': 
'ceph/cephfs-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/cephfs-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9025, 'inode': 1080142, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4521027, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320686 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/cephfs-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/cephfs-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9025, 'inode': 1080142, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4521027, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320696 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/cephfs-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/cephfs-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9025, 'inode': 1080142, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4521027, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320712 | orchestrator | changed: 
[testbed-node-1] => (item={'key': 'ceph/pool-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19609, 'inode': 1080157, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4568474, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320726 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/pool-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19609, 'inode': 1080157, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4568474, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320743 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/pool-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19609, 'inode': 1080157, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4568474, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320753 | orchestrator 
| changed: [testbed-node-1] => (item={'key': 'ceph/rbd-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12997, 'inode': 1080187, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4644608, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320763 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/rbd-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12997, 'inode': 1080187, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4644608, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320773 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/rbd-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12997, 'inode': 1080187, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4644608, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320790 | 
orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/ceph_overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 80386, 'inode': 1080135, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.450087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320800 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/ceph_overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 80386, 'inode': 1080135, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.450087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320821 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/ceph_overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 80386, 'inode': 1080135, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.450087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 
2025-09-23 19:28:44.320832 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/radosgw-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19695, 'inode': 1080169, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4587924, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320842 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/radosgw-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19695, 'inode': 1080169, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4587924, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320852 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/radosgw-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19695, 'inode': 1080169, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4587924, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 
'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320867 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/osds-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osds-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38432, 'inode': 1080154, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4551551, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320877 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/osds-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osds-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38432, 'inode': 1080154, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4551551, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320891 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/osds-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osds-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38432, 'inode': 1080154, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4551551, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 
'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320923 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/multi-cluster-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/multi-cluster-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 62676, 'inode': 1080151, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4540577, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320934 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/multi-cluster-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/multi-cluster-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 62676, 'inode': 1080151, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4540577, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320945 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/multi-cluster-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/multi-cluster-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 62676, 'inode': 1080151, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4540577, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': 
True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320960 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/hosts-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/hosts-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 27218, 'inode': 1080149, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4537134, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320970 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/hosts-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/hosts-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 27218, 'inode': 1080149, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4537134, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.320984 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/hosts-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/hosts-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 27218, 'inode': 1080149, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4537134, 'gr_name': 'root', 'pw_name': 
'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321001 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/pool-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 49139, 'inode': 1080163, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4583151, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321012 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/pool-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 49139, 'inode': 1080163, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4583151, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321030 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/pool-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 49139, 'inode': 1080163, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4583151, 
'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321056 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/host-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/host-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 44791, 'inode': 1080145, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4532206, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321074 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/host-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/host-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 44791, 'inode': 1080145, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4532206, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321091 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/host-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/host-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 44791, 'inode': 1080145, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 
1758653087.4532206, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321121 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/radosgw-sync-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-sync-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16156, 'inode': 1080181, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4641714, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321140 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/radosgw-sync-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-sync-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16156, 'inode': 1080181, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4641714, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321158 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/radosgw-sync-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-sync-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16156, 'inode': 1080181, 'dev': 
109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4641714, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321193 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openstack/openstack.json', 'value': {'path': '/operations/grafana/dashboards/openstack/openstack.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 57270, 'inode': 1080337, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5147445, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321211 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openstack/openstack.json', 'value': {'path': '/operations/grafana/dashboards/openstack/openstack.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 57270, 'inode': 1080337, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5147445, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321230 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openstack/openstack.json', 'value': {'path': '/operations/grafana/dashboards/openstack/openstack.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 
57270, 'inode': 1080337, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5147445, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321254 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/haproxy.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/haproxy.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 410814, 'inode': 1080227, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.483297, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321276 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/haproxy.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/haproxy.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 410814, 'inode': 1080227, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.483297, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321287 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/haproxy.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/haproxy.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': 
False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 410814, 'inode': 1080227, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.483297, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321304 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/database.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/database.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 30898, 'inode': 1080218, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4707925, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321314 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/database.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/database.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 30898, 'inode': 1080218, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4707925, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321324 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/database.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/database.json', 'mode': '0644', 'isdir': False, 'ischr': False, 
'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 30898, 'inode': 1080218, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4707925, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321339 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/node-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 15725, 'inode': 1080255, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4867446, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321355 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/node-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 15725, 'inode': 1080255, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4867446, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321365 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/node-rsrc-use.json', 'value': {'path': 
'/operations/grafana/dashboards/infrastructure/node-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 15725, 'inode': 1080255, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4867446, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321381 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/alertmanager-overview.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/alertmanager-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9645, 'inode': 1080209, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4687781, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321391 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/alertmanager-overview.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/alertmanager-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9645, 'inode': 1080209, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4687781, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321402 | 
orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/alertmanager-overview.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/alertmanager-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9645, 'inode': 1080209, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4687781, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321416 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/opensearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/opensearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 65458, 'inode': 1080288, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.501436, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321432 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/opensearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/opensearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 65458, 'inode': 1080288, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.501436, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': 
True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321442 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/opensearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/opensearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 65458, 'inode': 1080288, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.501436, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321457 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/node_exporter_full.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_full.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 682774, 'inode': 1080258, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4958074, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321467 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/node_exporter_full.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_full.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 682774, 'inode': 1080258, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4958074, 'gr_name': 'root', 'pw_name': 'root', 'wusr': 
True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321477 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/node_exporter_full.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_full.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 682774, 'inode': 1080258, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4958074, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321487 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/prometheus-remote-write.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus-remote-write.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 22317, 'inode': 1080296, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.502198, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321506 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/prometheus-remote-write.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus-remote-write.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 22317, 'inode': 1080296, 'dev': 109, 
'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.502198, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321517 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/prometheus-remote-write.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus-remote-write.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 22317, 'inode': 1080296, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.502198, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321532 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/redfish.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/redfish.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38087, 'inode': 1080326, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5121129, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321542 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/redfish.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/redfish.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 
'issock': False, 'uid': 0, 'gid': 0, 'size': 38087, 'inode': 1080326, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5121129, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321552 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/redfish.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/redfish.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38087, 'inode': 1080326, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5121129, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321562 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/nodes.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/nodes.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21109, 'inode': 1080286, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4985316, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321576 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/nodes.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/nodes.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': 
True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21109, 'inode': 1080286, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4985316, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321591 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/nodes.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/nodes.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21109, 'inode': 1080286, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4985316, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321609 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/memcached.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/memcached.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 24243, 'inode': 1080250, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4859216, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321620 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/memcached.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/memcached.json', 'mode': '0644', 
'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 24243, 'inode': 1080250, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4859216, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321630 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/memcached.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/memcached.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 24243, 'inode': 1080250, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4859216, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321640 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/fluentd.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/fluentd.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 82960, 'inode': 1080220, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4747927, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321654 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/fluentd.json', 'value': {'path': 
'/operations/grafana/dashboards/infrastructure/fluentd.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 82960, 'inode': 1080220, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4747927, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321670 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/fluentd.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/fluentd.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 82960, 'inode': 1080220, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4747927, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321686 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/libvirt.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/libvirt.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 29672, 'inode': 1080247, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4837928, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321696 | orchestrator | changed: [testbed-node-0] => (item={'key': 
'infrastructure/libvirt.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/libvirt.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 29672, 'inode': 1080247, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4837928, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321706 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/libvirt.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/libvirt.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 29672, 'inode': 1080247, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4837928, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321716 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/elasticsearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/elasticsearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 187864, 'inode': 1080219, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4727926, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321730 
| orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/elasticsearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/elasticsearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 187864, 'inode': 1080219, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4727926, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321747 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/elasticsearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/elasticsearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 187864, 'inode': 1080219, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4727926, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321762 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/node-cluster-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-cluster-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16098, 'inode': 1080254, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4859216, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 
'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321772 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/node-cluster-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-cluster-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16098, 'inode': 1080254, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4859216, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321782 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/node-cluster-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-cluster-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16098, 'inode': 1080254, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4859216, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321792 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/rabbitmq.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/rabbitmq.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 222049, 'inode': 1080308, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5097933, 'gr_name': 'root', 
'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321802 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/rabbitmq.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/rabbitmq.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 222049, 'inode': 1080308, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5097933, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321824 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/rabbitmq.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/rabbitmq.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 222049, 'inode': 1080308, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5097933, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321840 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/prometheus_alertmanager.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus_alertmanager.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 115472, 'inode': 1080302, 'dev': 109, 'nlink': 1, 'atime': 
1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5037932, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321850 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/prometheus_alertmanager.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus_alertmanager.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 115472, 'inode': 1080302, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5037932, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321860 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/prometheus_alertmanager.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus_alertmanager.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 115472, 'inode': 1080302, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.5037932, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321870 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/blackbox.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/blackbox.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 
'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 31128, 'inode': 1080212, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4695761, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321880 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/blackbox.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/blackbox.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 31128, 'inode': 1080212, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4695761, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321916 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/blackbox.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/blackbox.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 31128, 'inode': 1080212, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4695761, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321933 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/cadvisor.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/cadvisor.json', 'mode': '0644', 'isdir': False, 'ischr': 
False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 53882, 'inode': 1080216, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4703166, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321943 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/cadvisor.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/cadvisor.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 53882, 'inode': 1080216, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4703166, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321954 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/cadvisor.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/cadvisor.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 53882, 'inode': 1080216, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4703166, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321964 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/node_exporter_side_by_side.json', 'value': {'path': 
'/operations/grafana/dashboards/infrastructure/node_exporter_side_by_side.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 70691, 'inode': 1080281, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4985316, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321974 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/node_exporter_side_by_side.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_side_by_side.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 70691, 'inode': 1080281, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4985316, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.321989 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/node_exporter_side_by_side.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_side_by_side.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 70691, 'inode': 1080281, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.4985316, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': 
False}}) 2025-09-23 19:28:44.322065 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/prometheus.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21898, 'inode': 1080298, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.503372, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.322082 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/prometheus.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21898, 'inode': 1080298, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.503372, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.322092 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/prometheus.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21898, 'inode': 1080298, 'dev': 109, 'nlink': 1, 'atime': 1758652621.0, 'mtime': 1758652621.0, 'ctime': 1758653087.503372, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': 
False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-23 19:28:44.322102 | orchestrator | 2025-09-23 19:28:44.322113 | orchestrator | TASK [grafana : Check grafana containers] ************************************** 2025-09-23 19:28:44.322123 | orchestrator | Tuesday 23 September 2025 19:28:40 +0000 (0:00:34.797) 0:00:47.488 ***** 2025-09-23 19:28:44.322133 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.322143 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.322163 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': 
['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-23 19:28:44.322173 | orchestrator | 2025-09-23 19:28:44.322189 | orchestrator | TASK [grafana : Creating grafana database] ************************************* 2025-09-23 19:28:44.322199 | orchestrator | Tuesday 23 September 2025 19:28:41 +0000 (0:00:00.930) 0:00:48.419 ***** 2025-09-23 19:28:44.322209 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-23 19:28:44.322219 | orchestrator | 2025-09-23 19:28:44.322228 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-23 19:28:44.322238 | orchestrator | testbed-node-0 : ok=15  changed=8  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-23 19:28:44.322248 | orchestrator | testbed-node-1 : ok=13  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2025-09-23 19:28:44.322258 | orchestrator | testbed-node-2 : ok=13  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2025-09-23 19:28:44.322268 | orchestrator | 2025-09-23 19:28:44.322277 | orchestrator | 2025-09-23 19:28:44.322287 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-23 19:28:44.322296 | orchestrator | Tuesday 23 September 2025 19:28:42 +0000 (0:00:00.705) 0:00:49.125 ***** 2025-09-23 19:28:44.322306 | orchestrator | =============================================================================== 2025-09-23 19:28:44.322315 | orchestrator | grafana : Copying over custom 
dashboards ------------------------------- 34.80s 2025-09-23 19:28:44.322325 | orchestrator | service-cert-copy : grafana | Copying over extra CA certificates -------- 1.27s 2025-09-23 19:28:44.322334 | orchestrator | grafana : Copying over grafana.ini -------------------------------------- 1.27s 2025-09-23 19:28:44.322344 | orchestrator | grafana : Configuring dashboards provisioning --------------------------- 1.22s 2025-09-23 19:28:44.322353 | orchestrator | grafana : Configuring Prometheus as data source for Grafana ------------- 1.22s 2025-09-23 19:28:44.322363 | orchestrator | grafana : Copying over config.json files -------------------------------- 1.19s 2025-09-23 19:28:44.322372 | orchestrator | grafana : Check grafana containers -------------------------------------- 0.93s 2025-09-23 19:28:44.322382 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS key ----- 0.75s 2025-09-23 19:28:44.322391 | orchestrator | grafana : Check if extra configuration file exists ---------------------- 0.74s 2025-09-23 19:28:44.322401 | orchestrator | grafana : Ensuring config directories exist ----------------------------- 0.72s 2025-09-23 19:28:44.322410 | orchestrator | grafana : Creating grafana database ------------------------------------- 0.71s 2025-09-23 19:28:44.322420 | orchestrator | grafana : Find custom grafana dashboards -------------------------------- 0.68s 2025-09-23 19:28:44.322429 | orchestrator | grafana : Find templated grafana dashboards ----------------------------- 0.65s 2025-09-23 19:28:44.322439 | orchestrator | grafana : include_tasks ------------------------------------------------- 0.56s 2025-09-23 19:28:44.322448 | orchestrator | grafana : include_tasks ------------------------------------------------- 0.47s 2025-09-23 19:28:44.322463 | orchestrator | grafana : Prune templated Grafana dashboards ---------------------------- 0.43s 2025-09-23 19:28:44.322473 | orchestrator | grafana : Copying over extra configuration 
file ------------------------- 0.36s 2025-09-23 19:28:44.322482 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.34s 2025-09-23 19:28:44.322492 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS certificate --- 0.31s 2025-09-23 19:28:44.322502 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.27s 2025-09-23 19:28:47.354260 | orchestrator | 2025-09-23 19:28:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:28:47.354648 | orchestrator | 2025-09-23 19:28:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:28:47.354678 | orchestrator | 2025-09-23 19:28:47 | INFO  | Wait 1 second(s) until the next check
1 second(s) until the next check 2025-09-23 19:32:41.812369 | orchestrator | 2025-09-23 19:32:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:32:41.814264 | orchestrator | 2025-09-23 19:32:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:32:41.814293 | orchestrator | 2025-09-23 19:32:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:32:44.857424 | orchestrator | 2025-09-23 19:32:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:32:44.859279 | orchestrator | 2025-09-23 19:32:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:32:44.859315 | orchestrator | 2025-09-23 19:32:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:32:47.894707 | orchestrator | 2025-09-23 19:32:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:32:47.896317 | orchestrator | 2025-09-23 19:32:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:32:47.896350 | orchestrator | 2025-09-23 19:32:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:32:50.937992 | orchestrator | 2025-09-23 19:32:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:32:50.939551 | orchestrator | 2025-09-23 19:32:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:32:50.939625 | orchestrator | 2025-09-23 19:32:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:32:53.987629 | orchestrator | 2025-09-23 19:32:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:32:53.990282 | orchestrator | 2025-09-23 19:32:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:32:53.990322 | orchestrator | 2025-09-23 19:32:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:32:57.034691 | orchestrator | 
2025-09-23 19:32:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:32:57.036404 | orchestrator | 2025-09-23 19:32:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:32:57.036440 | orchestrator | 2025-09-23 19:32:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:00.072823 | orchestrator | 2025-09-23 19:33:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:00.073584 | orchestrator | 2025-09-23 19:33:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:00.073628 | orchestrator | 2025-09-23 19:33:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:03.123343 | orchestrator | 2025-09-23 19:33:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:03.124770 | orchestrator | 2025-09-23 19:33:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:03.124812 | orchestrator | 2025-09-23 19:33:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:06.168945 | orchestrator | 2025-09-23 19:33:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:06.170437 | orchestrator | 2025-09-23 19:33:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:06.170474 | orchestrator | 2025-09-23 19:33:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:09.219354 | orchestrator | 2025-09-23 19:33:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:09.220063 | orchestrator | 2025-09-23 19:33:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:09.220208 | orchestrator | 2025-09-23 19:33:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:12.262509 | orchestrator | 2025-09-23 19:33:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:33:12.263226 | orchestrator | 2025-09-23 19:33:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:12.263822 | orchestrator | 2025-09-23 19:33:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:15.316692 | orchestrator | 2025-09-23 19:33:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:15.318896 | orchestrator | 2025-09-23 19:33:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:15.318931 | orchestrator | 2025-09-23 19:33:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:18.364793 | orchestrator | 2025-09-23 19:33:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:18.366259 | orchestrator | 2025-09-23 19:33:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:18.366295 | orchestrator | 2025-09-23 19:33:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:21.408615 | orchestrator | 2025-09-23 19:33:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:21.409219 | orchestrator | 2025-09-23 19:33:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:21.409489 | orchestrator | 2025-09-23 19:33:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:24.453429 | orchestrator | 2025-09-23 19:33:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:24.456370 | orchestrator | 2025-09-23 19:33:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:24.456476 | orchestrator | 2025-09-23 19:33:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:27.505066 | orchestrator | 2025-09-23 19:33:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:27.507937 | orchestrator | 2025-09-23 19:33:27 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:27.508014 | orchestrator | 2025-09-23 19:33:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:30.555017 | orchestrator | 2025-09-23 19:33:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:30.555929 | orchestrator | 2025-09-23 19:33:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:30.556032 | orchestrator | 2025-09-23 19:33:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:33.602089 | orchestrator | 2025-09-23 19:33:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:33.604499 | orchestrator | 2025-09-23 19:33:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:33.604596 | orchestrator | 2025-09-23 19:33:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:36.646855 | orchestrator | 2025-09-23 19:33:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:36.648379 | orchestrator | 2025-09-23 19:33:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:36.648489 | orchestrator | 2025-09-23 19:33:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:39.687306 | orchestrator | 2025-09-23 19:33:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:39.689328 | orchestrator | 2025-09-23 19:33:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:39.689373 | orchestrator | 2025-09-23 19:33:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:42.741730 | orchestrator | 2025-09-23 19:33:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:42.742763 | orchestrator | 2025-09-23 19:33:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:33:42.742804 | orchestrator | 2025-09-23 19:33:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:45.783172 | orchestrator | 2025-09-23 19:33:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:45.784963 | orchestrator | 2025-09-23 19:33:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:45.784995 | orchestrator | 2025-09-23 19:33:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:48.833222 | orchestrator | 2025-09-23 19:33:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:48.834932 | orchestrator | 2025-09-23 19:33:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:48.835130 | orchestrator | 2025-09-23 19:33:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:51.881113 | orchestrator | 2025-09-23 19:33:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:51.883078 | orchestrator | 2025-09-23 19:33:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:51.883113 | orchestrator | 2025-09-23 19:33:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:54.924197 | orchestrator | 2025-09-23 19:33:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:54.925974 | orchestrator | 2025-09-23 19:33:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:54.926073 | orchestrator | 2025-09-23 19:33:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:33:57.976093 | orchestrator | 2025-09-23 19:33:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:33:57.978139 | orchestrator | 2025-09-23 19:33:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:33:57.978242 | orchestrator | 2025-09-23 19:33:57 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:34:01.026276 | orchestrator | 2025-09-23 19:34:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:01.027374 | orchestrator | 2025-09-23 19:34:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:01.027407 | orchestrator | 2025-09-23 19:34:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:04.071472 | orchestrator | 2025-09-23 19:34:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:04.072701 | orchestrator | 2025-09-23 19:34:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:04.072724 | orchestrator | 2025-09-23 19:34:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:07.116100 | orchestrator | 2025-09-23 19:34:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:07.118612 | orchestrator | 2025-09-23 19:34:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:07.152195 | orchestrator | 2025-09-23 19:34:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:10.163356 | orchestrator | 2025-09-23 19:34:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:10.166150 | orchestrator | 2025-09-23 19:34:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:10.166217 | orchestrator | 2025-09-23 19:34:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:13.211749 | orchestrator | 2025-09-23 19:34:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:13.212981 | orchestrator | 2025-09-23 19:34:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:13.213013 | orchestrator | 2025-09-23 19:34:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:16.261086 | orchestrator | 
2025-09-23 19:34:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:16.263115 | orchestrator | 2025-09-23 19:34:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:16.263152 | orchestrator | 2025-09-23 19:34:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:19.310891 | orchestrator | 2025-09-23 19:34:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:19.312220 | orchestrator | 2025-09-23 19:34:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:19.312507 | orchestrator | 2025-09-23 19:34:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:22.356614 | orchestrator | 2025-09-23 19:34:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:22.358137 | orchestrator | 2025-09-23 19:34:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:22.358172 | orchestrator | 2025-09-23 19:34:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:25.397972 | orchestrator | 2025-09-23 19:34:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:25.399749 | orchestrator | 2025-09-23 19:34:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:25.399789 | orchestrator | 2025-09-23 19:34:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:28.446764 | orchestrator | 2025-09-23 19:34:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:28.448658 | orchestrator | 2025-09-23 19:34:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:28.448705 | orchestrator | 2025-09-23 19:34:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:31.494980 | orchestrator | 2025-09-23 19:34:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:34:31.496178 | orchestrator | 2025-09-23 19:34:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:31.496264 | orchestrator | 2025-09-23 19:34:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:34.544796 | orchestrator | 2025-09-23 19:34:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:34.547188 | orchestrator | 2025-09-23 19:34:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:34.547884 | orchestrator | 2025-09-23 19:34:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:37.590318 | orchestrator | 2025-09-23 19:34:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:37.592490 | orchestrator | 2025-09-23 19:34:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:37.592535 | orchestrator | 2025-09-23 19:34:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:40.640286 | orchestrator | 2025-09-23 19:34:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:40.641954 | orchestrator | 2025-09-23 19:34:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:40.642088 | orchestrator | 2025-09-23 19:34:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:43.685376 | orchestrator | 2025-09-23 19:34:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:43.687021 | orchestrator | 2025-09-23 19:34:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:43.687064 | orchestrator | 2025-09-23 19:34:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:46.731213 | orchestrator | 2025-09-23 19:34:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:46.732613 | orchestrator | 2025-09-23 19:34:46 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:46.732778 | orchestrator | 2025-09-23 19:34:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:49.782949 | orchestrator | 2025-09-23 19:34:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:49.784742 | orchestrator | 2025-09-23 19:34:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:49.784793 | orchestrator | 2025-09-23 19:34:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:52.828954 | orchestrator | 2025-09-23 19:34:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:52.830153 | orchestrator | 2025-09-23 19:34:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:52.830231 | orchestrator | 2025-09-23 19:34:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:55.872048 | orchestrator | 2025-09-23 19:34:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:55.873211 | orchestrator | 2025-09-23 19:34:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:55.873260 | orchestrator | 2025-09-23 19:34:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:34:58.918761 | orchestrator | 2025-09-23 19:34:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:34:58.919646 | orchestrator | 2025-09-23 19:34:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:34:58.919739 | orchestrator | 2025-09-23 19:34:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:01.966887 | orchestrator | 2025-09-23 19:35:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:01.968295 | orchestrator | 2025-09-23 19:35:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:35:01.968346 | orchestrator | 2025-09-23 19:35:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:05.016689 | orchestrator | 2025-09-23 19:35:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:05.018389 | orchestrator | 2025-09-23 19:35:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:05.018676 | orchestrator | 2025-09-23 19:35:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:08.057812 | orchestrator | 2025-09-23 19:35:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:08.059728 | orchestrator | 2025-09-23 19:35:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:08.059814 | orchestrator | 2025-09-23 19:35:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:11.103393 | orchestrator | 2025-09-23 19:35:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:11.105444 | orchestrator | 2025-09-23 19:35:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:11.105475 | orchestrator | 2025-09-23 19:35:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:14.150750 | orchestrator | 2025-09-23 19:35:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:14.152655 | orchestrator | 2025-09-23 19:35:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:14.152697 | orchestrator | 2025-09-23 19:35:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:17.200774 | orchestrator | 2025-09-23 19:35:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:17.204419 | orchestrator | 2025-09-23 19:35:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:17.204513 | orchestrator | 2025-09-23 19:35:17 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:35:20.253093 | orchestrator | 2025-09-23 19:35:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:20.254635 | orchestrator | 2025-09-23 19:35:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:20.254683 | orchestrator | 2025-09-23 19:35:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:23.301865 | orchestrator | 2025-09-23 19:35:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:23.303799 | orchestrator | 2025-09-23 19:35:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:23.303883 | orchestrator | 2025-09-23 19:35:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:26.343702 | orchestrator | 2025-09-23 19:35:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:26.345694 | orchestrator | 2025-09-23 19:35:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:26.345750 | orchestrator | 2025-09-23 19:35:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:29.393929 | orchestrator | 2025-09-23 19:35:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:29.395586 | orchestrator | 2025-09-23 19:35:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:29.395774 | orchestrator | 2025-09-23 19:35:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:32.443303 | orchestrator | 2025-09-23 19:35:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:32.443979 | orchestrator | 2025-09-23 19:35:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:32.444014 | orchestrator | 2025-09-23 19:35:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:35.494294 | orchestrator | 
2025-09-23 19:35:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:35.496533 | orchestrator | 2025-09-23 19:35:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:35.496623 | orchestrator | 2025-09-23 19:35:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:38.540293 | orchestrator | 2025-09-23 19:35:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:38.542514 | orchestrator | 2025-09-23 19:35:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:38.543068 | orchestrator | 2025-09-23 19:35:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:41.587549 | orchestrator | 2025-09-23 19:35:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:41.588348 | orchestrator | 2025-09-23 19:35:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:41.588508 | orchestrator | 2025-09-23 19:35:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:44.631636 | orchestrator | 2025-09-23 19:35:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:44.633025 | orchestrator | 2025-09-23 19:35:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:44.633059 | orchestrator | 2025-09-23 19:35:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:47.674877 | orchestrator | 2025-09-23 19:35:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:47.675967 | orchestrator | 2025-09-23 19:35:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:47.676004 | orchestrator | 2025-09-23 19:35:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:50.720456 | orchestrator | 2025-09-23 19:35:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:35:50.722774 | orchestrator | 2025-09-23 19:35:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:50.722819 | orchestrator | 2025-09-23 19:35:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:53.761339 | orchestrator | 2025-09-23 19:35:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:53.762524 | orchestrator | 2025-09-23 19:35:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:53.762562 | orchestrator | 2025-09-23 19:35:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:56.804857 | orchestrator | 2025-09-23 19:35:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:56.806944 | orchestrator | 2025-09-23 19:35:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:56.806981 | orchestrator | 2025-09-23 19:35:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:35:59.857086 | orchestrator | 2025-09-23 19:35:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:35:59.859826 | orchestrator | 2025-09-23 19:35:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:35:59.860051 | orchestrator | 2025-09-23 19:35:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:36:02.907869 | orchestrator | 2025-09-23 19:36:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:36:02.910491 | orchestrator | 2025-09-23 19:36:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:36:02.910528 | orchestrator | 2025-09-23 19:36:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:36:05.958302 | orchestrator | 2025-09-23 19:36:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:36:05.959866 | orchestrator | 2025-09-23 19:36:05 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:36:05.959900 | orchestrator | 2025-09-23 19:36:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:36:09.001541 | orchestrator | 2025-09-23 19:36:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:36:09.004269 | orchestrator | 2025-09-23 19:36:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:36:09.004305 | orchestrator | 2025-09-23 19:36:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:36:12.050235 | orchestrator | 2025-09-23 19:36:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:36:12.051540 | orchestrator | 2025-09-23 19:36:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:36:12.051686 | orchestrator | 2025-09-23 19:36:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:36:15.103551 | orchestrator | 2025-09-23 19:36:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:36:15.105603 | orchestrator | 2025-09-23 19:36:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:36:15.105625 | orchestrator | 2025-09-23 19:36:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:36:18.151198 | orchestrator | 2025-09-23 19:36:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:36:18.152552 | orchestrator | 2025-09-23 19:36:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:36:18.152604 | orchestrator | 2025-09-23 19:36:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:36:21.203119 | orchestrator | 2025-09-23 19:36:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:36:21.205686 | orchestrator | 2025-09-23 19:36:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:36:21.205730 | orchestrator | 2025-09-23 19:36:21 | INFO  | Wait 1 second(s) until the next check
2025-09-23 19:36:24.247294 | orchestrator | 2025-09-23 19:36:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:36:24.249429 | orchestrator | 2025-09-23 19:36:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:36:24.249461 | orchestrator | 2025-09-23 19:36:24 | INFO  | Wait 1 second(s) until the next check
[... identical check rounds repeated every ~3 seconds from 19:36:27 through 19:41:50; both tasks remained in state STARTED throughout ...]
2025-09-23 19:41:53.360405 | orchestrator | 2025-09-23 19:41:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 19:41:53.362887 | orchestrator | 2025-09-23 19:41:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 19:41:53.362967 | orchestrator | 2025-09-23 19:41:53 | INFO  | Wait
1 second(s) until the next check 2025-09-23 19:41:56.407614 | orchestrator | 2025-09-23 19:41:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:41:56.409465 | orchestrator | 2025-09-23 19:41:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:41:56.409544 | orchestrator | 2025-09-23 19:41:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:41:59.459299 | orchestrator | 2025-09-23 19:41:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:41:59.461146 | orchestrator | 2025-09-23 19:41:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:41:59.461204 | orchestrator | 2025-09-23 19:41:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:02.506879 | orchestrator | 2025-09-23 19:42:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:02.509944 | orchestrator | 2025-09-23 19:42:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:02.510081 | orchestrator | 2025-09-23 19:42:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:05.556347 | orchestrator | 2025-09-23 19:42:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:05.557498 | orchestrator | 2025-09-23 19:42:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:05.557649 | orchestrator | 2025-09-23 19:42:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:08.602298 | orchestrator | 2025-09-23 19:42:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:08.603397 | orchestrator | 2025-09-23 19:42:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:08.603417 | orchestrator | 2025-09-23 19:42:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:11.653044 | orchestrator | 
2025-09-23 19:42:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:11.655493 | orchestrator | 2025-09-23 19:42:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:11.655617 | orchestrator | 2025-09-23 19:42:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:14.704114 | orchestrator | 2025-09-23 19:42:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:14.706500 | orchestrator | 2025-09-23 19:42:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:14.706588 | orchestrator | 2025-09-23 19:42:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:17.750898 | orchestrator | 2025-09-23 19:42:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:17.752461 | orchestrator | 2025-09-23 19:42:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:17.752516 | orchestrator | 2025-09-23 19:42:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:20.801390 | orchestrator | 2025-09-23 19:42:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:20.803910 | orchestrator | 2025-09-23 19:42:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:20.803946 | orchestrator | 2025-09-23 19:42:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:23.851871 | orchestrator | 2025-09-23 19:42:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:23.853441 | orchestrator | 2025-09-23 19:42:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:23.853486 | orchestrator | 2025-09-23 19:42:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:26.897764 | orchestrator | 2025-09-23 19:42:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:42:26.900141 | orchestrator | 2025-09-23 19:42:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:26.900209 | orchestrator | 2025-09-23 19:42:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:29.944836 | orchestrator | 2025-09-23 19:42:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:29.947449 | orchestrator | 2025-09-23 19:42:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:29.947510 | orchestrator | 2025-09-23 19:42:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:32.995877 | orchestrator | 2025-09-23 19:42:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:32.998138 | orchestrator | 2025-09-23 19:42:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:32.998216 | orchestrator | 2025-09-23 19:42:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:36.046725 | orchestrator | 2025-09-23 19:42:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:36.047326 | orchestrator | 2025-09-23 19:42:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:36.047795 | orchestrator | 2025-09-23 19:42:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:39.085662 | orchestrator | 2025-09-23 19:42:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:39.088217 | orchestrator | 2025-09-23 19:42:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:39.088240 | orchestrator | 2025-09-23 19:42:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:42.131902 | orchestrator | 2025-09-23 19:42:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:42.133332 | orchestrator | 2025-09-23 19:42:42 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:42.133368 | orchestrator | 2025-09-23 19:42:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:45.177675 | orchestrator | 2025-09-23 19:42:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:45.179097 | orchestrator | 2025-09-23 19:42:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:45.179647 | orchestrator | 2025-09-23 19:42:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:48.216450 | orchestrator | 2025-09-23 19:42:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:48.217811 | orchestrator | 2025-09-23 19:42:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:48.218149 | orchestrator | 2025-09-23 19:42:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:51.263829 | orchestrator | 2025-09-23 19:42:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:51.265747 | orchestrator | 2025-09-23 19:42:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:51.265780 | orchestrator | 2025-09-23 19:42:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:54.316315 | orchestrator | 2025-09-23 19:42:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:54.317677 | orchestrator | 2025-09-23 19:42:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:42:54.317826 | orchestrator | 2025-09-23 19:42:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:42:57.356855 | orchestrator | 2025-09-23 19:42:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:42:57.357081 | orchestrator | 2025-09-23 19:42:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:42:57.357132 | orchestrator | 2025-09-23 19:42:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:00.403917 | orchestrator | 2025-09-23 19:43:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:00.404944 | orchestrator | 2025-09-23 19:43:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:00.405339 | orchestrator | 2025-09-23 19:43:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:03.448861 | orchestrator | 2025-09-23 19:43:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:03.450363 | orchestrator | 2025-09-23 19:43:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:03.450462 | orchestrator | 2025-09-23 19:43:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:06.502868 | orchestrator | 2025-09-23 19:43:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:06.505072 | orchestrator | 2025-09-23 19:43:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:06.505448 | orchestrator | 2025-09-23 19:43:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:09.545781 | orchestrator | 2025-09-23 19:43:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:09.548154 | orchestrator | 2025-09-23 19:43:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:09.548188 | orchestrator | 2025-09-23 19:43:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:12.592118 | orchestrator | 2025-09-23 19:43:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:12.595849 | orchestrator | 2025-09-23 19:43:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:12.595924 | orchestrator | 2025-09-23 19:43:12 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:43:15.646156 | orchestrator | 2025-09-23 19:43:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:15.647551 | orchestrator | 2025-09-23 19:43:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:15.647857 | orchestrator | 2025-09-23 19:43:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:18.696774 | orchestrator | 2025-09-23 19:43:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:18.697589 | orchestrator | 2025-09-23 19:43:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:18.697759 | orchestrator | 2025-09-23 19:43:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:21.746338 | orchestrator | 2025-09-23 19:43:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:21.748504 | orchestrator | 2025-09-23 19:43:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:21.748665 | orchestrator | 2025-09-23 19:43:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:24.790675 | orchestrator | 2025-09-23 19:43:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:24.791241 | orchestrator | 2025-09-23 19:43:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:24.791356 | orchestrator | 2025-09-23 19:43:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:27.832760 | orchestrator | 2025-09-23 19:43:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:27.835052 | orchestrator | 2025-09-23 19:43:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:27.835137 | orchestrator | 2025-09-23 19:43:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:30.879079 | orchestrator | 
2025-09-23 19:43:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:30.879678 | orchestrator | 2025-09-23 19:43:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:30.879804 | orchestrator | 2025-09-23 19:43:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:33.923653 | orchestrator | 2025-09-23 19:43:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:33.925019 | orchestrator | 2025-09-23 19:43:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:33.925078 | orchestrator | 2025-09-23 19:43:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:36.975071 | orchestrator | 2025-09-23 19:43:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:36.976808 | orchestrator | 2025-09-23 19:43:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:36.976842 | orchestrator | 2025-09-23 19:43:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:40.022679 | orchestrator | 2025-09-23 19:43:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:40.024572 | orchestrator | 2025-09-23 19:43:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:40.024771 | orchestrator | 2025-09-23 19:43:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:43.071775 | orchestrator | 2025-09-23 19:43:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:43.074604 | orchestrator | 2025-09-23 19:43:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:43.074656 | orchestrator | 2025-09-23 19:43:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:46.122974 | orchestrator | 2025-09-23 19:43:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:43:46.124460 | orchestrator | 2025-09-23 19:43:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:46.124509 | orchestrator | 2025-09-23 19:43:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:49.163091 | orchestrator | 2025-09-23 19:43:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:49.165102 | orchestrator | 2025-09-23 19:43:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:49.165137 | orchestrator | 2025-09-23 19:43:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:52.209314 | orchestrator | 2025-09-23 19:43:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:52.210611 | orchestrator | 2025-09-23 19:43:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:52.210645 | orchestrator | 2025-09-23 19:43:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:55.253720 | orchestrator | 2025-09-23 19:43:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:55.255749 | orchestrator | 2025-09-23 19:43:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:55.255827 | orchestrator | 2025-09-23 19:43:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:43:58.305744 | orchestrator | 2025-09-23 19:43:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:43:58.307229 | orchestrator | 2025-09-23 19:43:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:43:58.307492 | orchestrator | 2025-09-23 19:43:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:01.355797 | orchestrator | 2025-09-23 19:44:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:01.357167 | orchestrator | 2025-09-23 19:44:01 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:01.357325 | orchestrator | 2025-09-23 19:44:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:04.400515 | orchestrator | 2025-09-23 19:44:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:04.402102 | orchestrator | 2025-09-23 19:44:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:04.402136 | orchestrator | 2025-09-23 19:44:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:07.455177 | orchestrator | 2025-09-23 19:44:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:07.456872 | orchestrator | 2025-09-23 19:44:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:07.457019 | orchestrator | 2025-09-23 19:44:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:10.501966 | orchestrator | 2025-09-23 19:44:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:10.502952 | orchestrator | 2025-09-23 19:44:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:10.502985 | orchestrator | 2025-09-23 19:44:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:13.543542 | orchestrator | 2025-09-23 19:44:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:13.544121 | orchestrator | 2025-09-23 19:44:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:13.544767 | orchestrator | 2025-09-23 19:44:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:16.590434 | orchestrator | 2025-09-23 19:44:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:16.594864 | orchestrator | 2025-09-23 19:44:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:44:16.594922 | orchestrator | 2025-09-23 19:44:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:19.632907 | orchestrator | 2025-09-23 19:44:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:19.635518 | orchestrator | 2025-09-23 19:44:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:19.635557 | orchestrator | 2025-09-23 19:44:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:22.679541 | orchestrator | 2025-09-23 19:44:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:22.680895 | orchestrator | 2025-09-23 19:44:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:22.681007 | orchestrator | 2025-09-23 19:44:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:25.727507 | orchestrator | 2025-09-23 19:44:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:25.729093 | orchestrator | 2025-09-23 19:44:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:25.729536 | orchestrator | 2025-09-23 19:44:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:28.770423 | orchestrator | 2025-09-23 19:44:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:28.772610 | orchestrator | 2025-09-23 19:44:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:28.772684 | orchestrator | 2025-09-23 19:44:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:31.815495 | orchestrator | 2025-09-23 19:44:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:31.817575 | orchestrator | 2025-09-23 19:44:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:31.817914 | orchestrator | 2025-09-23 19:44:31 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:44:34.866487 | orchestrator | 2025-09-23 19:44:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:34.867896 | orchestrator | 2025-09-23 19:44:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:34.867929 | orchestrator | 2025-09-23 19:44:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:37.906900 | orchestrator | 2025-09-23 19:44:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:37.908053 | orchestrator | 2025-09-23 19:44:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:37.908104 | orchestrator | 2025-09-23 19:44:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:40.955645 | orchestrator | 2025-09-23 19:44:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:40.957139 | orchestrator | 2025-09-23 19:44:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:40.957220 | orchestrator | 2025-09-23 19:44:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:44.002416 | orchestrator | 2025-09-23 19:44:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:44.003461 | orchestrator | 2025-09-23 19:44:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:44.003493 | orchestrator | 2025-09-23 19:44:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:47.047116 | orchestrator | 2025-09-23 19:44:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:47.048276 | orchestrator | 2025-09-23 19:44:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:47.048332 | orchestrator | 2025-09-23 19:44:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:50.093718 | orchestrator | 
2025-09-23 19:44:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:50.094744 | orchestrator | 2025-09-23 19:44:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:50.094791 | orchestrator | 2025-09-23 19:44:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:53.138964 | orchestrator | 2025-09-23 19:44:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:53.140970 | orchestrator | 2025-09-23 19:44:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:53.141114 | orchestrator | 2025-09-23 19:44:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:56.193637 | orchestrator | 2025-09-23 19:44:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:56.195137 | orchestrator | 2025-09-23 19:44:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:56.195420 | orchestrator | 2025-09-23 19:44:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:44:59.244677 | orchestrator | 2025-09-23 19:44:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:44:59.247000 | orchestrator | 2025-09-23 19:44:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:44:59.247042 | orchestrator | 2025-09-23 19:44:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:02.293904 | orchestrator | 2025-09-23 19:45:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:02.295794 | orchestrator | 2025-09-23 19:45:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:02.296291 | orchestrator | 2025-09-23 19:45:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:05.344058 | orchestrator | 2025-09-23 19:45:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:45:05.346720 | orchestrator | 2025-09-23 19:45:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:05.346903 | orchestrator | 2025-09-23 19:45:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:08.390914 | orchestrator | 2025-09-23 19:45:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:08.393127 | orchestrator | 2025-09-23 19:45:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:08.393160 | orchestrator | 2025-09-23 19:45:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:11.437053 | orchestrator | 2025-09-23 19:45:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:11.438489 | orchestrator | 2025-09-23 19:45:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:11.438521 | orchestrator | 2025-09-23 19:45:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:14.478751 | orchestrator | 2025-09-23 19:45:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:14.480160 | orchestrator | 2025-09-23 19:45:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:14.480191 | orchestrator | 2025-09-23 19:45:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:17.520714 | orchestrator | 2025-09-23 19:45:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:17.522115 | orchestrator | 2025-09-23 19:45:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:17.522219 | orchestrator | 2025-09-23 19:45:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:20.565931 | orchestrator | 2025-09-23 19:45:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:20.567439 | orchestrator | 2025-09-23 19:45:20 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:20.567565 | orchestrator | 2025-09-23 19:45:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:23.613545 | orchestrator | 2025-09-23 19:45:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:23.616568 | orchestrator | 2025-09-23 19:45:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:23.616648 | orchestrator | 2025-09-23 19:45:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:26.664892 | orchestrator | 2025-09-23 19:45:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:26.666561 | orchestrator | 2025-09-23 19:45:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:26.666593 | orchestrator | 2025-09-23 19:45:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:29.709609 | orchestrator | 2025-09-23 19:45:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:29.710957 | orchestrator | 2025-09-23 19:45:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:29.711160 | orchestrator | 2025-09-23 19:45:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:32.760361 | orchestrator | 2025-09-23 19:45:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:32.762974 | orchestrator | 2025-09-23 19:45:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:32.763137 | orchestrator | 2025-09-23 19:45:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:35.810159 | orchestrator | 2025-09-23 19:45:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:35.812521 | orchestrator | 2025-09-23 19:45:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:45:35.812581 | orchestrator | 2025-09-23 19:45:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:38.857023 | orchestrator | 2025-09-23 19:45:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:38.858469 | orchestrator | 2025-09-23 19:45:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:38.858874 | orchestrator | 2025-09-23 19:45:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:41.905447 | orchestrator | 2025-09-23 19:45:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:41.908002 | orchestrator | 2025-09-23 19:45:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:41.908044 | orchestrator | 2025-09-23 19:45:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:44.950765 | orchestrator | 2025-09-23 19:45:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:44.952726 | orchestrator | 2025-09-23 19:45:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:44.952770 | orchestrator | 2025-09-23 19:45:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:47.996192 | orchestrator | 2025-09-23 19:45:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:47.997638 | orchestrator | 2025-09-23 19:45:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:47.997715 | orchestrator | 2025-09-23 19:45:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:45:51.040718 | orchestrator | 2025-09-23 19:45:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:45:51.041681 | orchestrator | 2025-09-23 19:45:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:45:51.041718 | orchestrator | 2025-09-23 19:45:51 | INFO  | Wait 
1 second(s) until the next check
2025-09-23 19:50:37.518428 | orchestrator | 2025-09-23 19:50:37 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:50:37.518464 | orchestrator | 2025-09-23 19:50:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:50:40.565132 | orchestrator | 2025-09-23 19:50:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:50:40.566342 | orchestrator | 2025-09-23 19:50:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:50:40.566468 | orchestrator | 2025-09-23 19:50:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:50:43.618367 | orchestrator | 2025-09-23 19:50:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:50:43.620502 | orchestrator | 2025-09-23 19:50:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:50:43.620553 | orchestrator | 2025-09-23 19:50:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:50:46.667051 | orchestrator | 2025-09-23 19:50:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:50:46.667860 | orchestrator | 2025-09-23 19:50:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:50:46.668244 | orchestrator | 2025-09-23 19:50:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:50:49.712612 | orchestrator | 2025-09-23 19:50:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:50:49.714802 | orchestrator | 2025-09-23 19:50:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:50:49.714826 | orchestrator | 2025-09-23 19:50:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:50:52.758108 | orchestrator | 2025-09-23 19:50:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:50:52.760043 | orchestrator | 2025-09-23 19:50:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:50:52.760160 | orchestrator | 2025-09-23 19:50:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:50:55.803691 | orchestrator | 2025-09-23 19:50:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:50:55.806131 | orchestrator | 2025-09-23 19:50:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:50:55.806168 | orchestrator | 2025-09-23 19:50:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:50:58.851701 | orchestrator | 2025-09-23 19:50:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:50:58.854469 | orchestrator | 2025-09-23 19:50:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:50:58.854594 | orchestrator | 2025-09-23 19:50:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:01.903037 | orchestrator | 2025-09-23 19:51:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:01.905041 | orchestrator | 2025-09-23 19:51:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:01.905124 | orchestrator | 2025-09-23 19:51:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:04.952760 | orchestrator | 2025-09-23 19:51:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:04.954214 | orchestrator | 2025-09-23 19:51:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:04.954267 | orchestrator | 2025-09-23 19:51:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:08.003101 | orchestrator | 2025-09-23 19:51:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:08.004664 | orchestrator | 2025-09-23 19:51:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:08.004699 | orchestrator | 2025-09-23 19:51:08 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:51:11.051171 | orchestrator | 2025-09-23 19:51:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:11.051889 | orchestrator | 2025-09-23 19:51:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:11.051925 | orchestrator | 2025-09-23 19:51:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:14.098592 | orchestrator | 2025-09-23 19:51:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:14.099862 | orchestrator | 2025-09-23 19:51:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:14.099916 | orchestrator | 2025-09-23 19:51:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:17.144991 | orchestrator | 2025-09-23 19:51:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:17.146596 | orchestrator | 2025-09-23 19:51:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:17.146633 | orchestrator | 2025-09-23 19:51:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:20.190422 | orchestrator | 2025-09-23 19:51:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:20.191472 | orchestrator | 2025-09-23 19:51:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:20.191546 | orchestrator | 2025-09-23 19:51:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:23.238163 | orchestrator | 2025-09-23 19:51:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:23.238485 | orchestrator | 2025-09-23 19:51:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:23.238511 | orchestrator | 2025-09-23 19:51:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:26.283684 | orchestrator | 
2025-09-23 19:51:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:26.285335 | orchestrator | 2025-09-23 19:51:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:26.285628 | orchestrator | 2025-09-23 19:51:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:29.331657 | orchestrator | 2025-09-23 19:51:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:29.332953 | orchestrator | 2025-09-23 19:51:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:29.332987 | orchestrator | 2025-09-23 19:51:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:32.378786 | orchestrator | 2025-09-23 19:51:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:32.379939 | orchestrator | 2025-09-23 19:51:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:32.380169 | orchestrator | 2025-09-23 19:51:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:35.425077 | orchestrator | 2025-09-23 19:51:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:35.425538 | orchestrator | 2025-09-23 19:51:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:35.425575 | orchestrator | 2025-09-23 19:51:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:38.473530 | orchestrator | 2025-09-23 19:51:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:38.476287 | orchestrator | 2025-09-23 19:51:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:38.476391 | orchestrator | 2025-09-23 19:51:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:41.520790 | orchestrator | 2025-09-23 19:51:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:51:41.521826 | orchestrator | 2025-09-23 19:51:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:41.521866 | orchestrator | 2025-09-23 19:51:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:44.573719 | orchestrator | 2025-09-23 19:51:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:44.575625 | orchestrator | 2025-09-23 19:51:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:44.575656 | orchestrator | 2025-09-23 19:51:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:47.617617 | orchestrator | 2025-09-23 19:51:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:47.619908 | orchestrator | 2025-09-23 19:51:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:47.619981 | orchestrator | 2025-09-23 19:51:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:50.663069 | orchestrator | 2025-09-23 19:51:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:50.665639 | orchestrator | 2025-09-23 19:51:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:50.665701 | orchestrator | 2025-09-23 19:51:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:53.712401 | orchestrator | 2025-09-23 19:51:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:53.713885 | orchestrator | 2025-09-23 19:51:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:53.713912 | orchestrator | 2025-09-23 19:51:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:56.756763 | orchestrator | 2025-09-23 19:51:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:56.760086 | orchestrator | 2025-09-23 19:51:56 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:56.760159 | orchestrator | 2025-09-23 19:51:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:51:59.813477 | orchestrator | 2025-09-23 19:51:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:51:59.815009 | orchestrator | 2025-09-23 19:51:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:51:59.815516 | orchestrator | 2025-09-23 19:51:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:02.866707 | orchestrator | 2025-09-23 19:52:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:02.867885 | orchestrator | 2025-09-23 19:52:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:02.867917 | orchestrator | 2025-09-23 19:52:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:05.914275 | orchestrator | 2025-09-23 19:52:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:05.915821 | orchestrator | 2025-09-23 19:52:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:05.915843 | orchestrator | 2025-09-23 19:52:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:08.965256 | orchestrator | 2025-09-23 19:52:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:08.966848 | orchestrator | 2025-09-23 19:52:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:08.966919 | orchestrator | 2025-09-23 19:52:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:12.019457 | orchestrator | 2025-09-23 19:52:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:12.021405 | orchestrator | 2025-09-23 19:52:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:52:12.021479 | orchestrator | 2025-09-23 19:52:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:15.069117 | orchestrator | 2025-09-23 19:52:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:15.071486 | orchestrator | 2025-09-23 19:52:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:15.071538 | orchestrator | 2025-09-23 19:52:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:18.110510 | orchestrator | 2025-09-23 19:52:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:18.112036 | orchestrator | 2025-09-23 19:52:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:18.112068 | orchestrator | 2025-09-23 19:52:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:21.159630 | orchestrator | 2025-09-23 19:52:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:21.160194 | orchestrator | 2025-09-23 19:52:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:21.160223 | orchestrator | 2025-09-23 19:52:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:24.204845 | orchestrator | 2025-09-23 19:52:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:24.206092 | orchestrator | 2025-09-23 19:52:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:24.206158 | orchestrator | 2025-09-23 19:52:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:27.251960 | orchestrator | 2025-09-23 19:52:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:27.253680 | orchestrator | 2025-09-23 19:52:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:27.253751 | orchestrator | 2025-09-23 19:52:27 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:52:30.298301 | orchestrator | 2025-09-23 19:52:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:30.299476 | orchestrator | 2025-09-23 19:52:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:30.299546 | orchestrator | 2025-09-23 19:52:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:33.349585 | orchestrator | 2025-09-23 19:52:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:33.350694 | orchestrator | 2025-09-23 19:52:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:33.350724 | orchestrator | 2025-09-23 19:52:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:36.398572 | orchestrator | 2025-09-23 19:52:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:36.399676 | orchestrator | 2025-09-23 19:52:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:36.399708 | orchestrator | 2025-09-23 19:52:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:39.453641 | orchestrator | 2025-09-23 19:52:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:39.454763 | orchestrator | 2025-09-23 19:52:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:39.454798 | orchestrator | 2025-09-23 19:52:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:42.500499 | orchestrator | 2025-09-23 19:52:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:42.502360 | orchestrator | 2025-09-23 19:52:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:42.502390 | orchestrator | 2025-09-23 19:52:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:45.561905 | orchestrator | 
2025-09-23 19:52:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:45.563328 | orchestrator | 2025-09-23 19:52:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:45.563369 | orchestrator | 2025-09-23 19:52:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:48.610849 | orchestrator | 2025-09-23 19:52:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:48.613509 | orchestrator | 2025-09-23 19:52:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:48.613552 | orchestrator | 2025-09-23 19:52:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:51.655810 | orchestrator | 2025-09-23 19:52:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:51.657046 | orchestrator | 2025-09-23 19:52:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:51.657080 | orchestrator | 2025-09-23 19:52:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:54.703506 | orchestrator | 2025-09-23 19:52:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:54.705108 | orchestrator | 2025-09-23 19:52:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:54.705212 | orchestrator | 2025-09-23 19:52:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:52:57.753325 | orchestrator | 2025-09-23 19:52:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:52:57.754920 | orchestrator | 2025-09-23 19:52:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:52:57.755068 | orchestrator | 2025-09-23 19:52:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:00.796054 | orchestrator | 2025-09-23 19:53:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:53:00.798011 | orchestrator | 2025-09-23 19:53:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:00.798144 | orchestrator | 2025-09-23 19:53:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:03.842354 | orchestrator | 2025-09-23 19:53:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:03.844952 | orchestrator | 2025-09-23 19:53:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:03.845034 | orchestrator | 2025-09-23 19:53:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:06.889426 | orchestrator | 2025-09-23 19:53:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:06.890550 | orchestrator | 2025-09-23 19:53:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:06.890639 | orchestrator | 2025-09-23 19:53:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:09.936416 | orchestrator | 2025-09-23 19:53:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:09.937551 | orchestrator | 2025-09-23 19:53:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:09.937762 | orchestrator | 2025-09-23 19:53:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:12.975352 | orchestrator | 2025-09-23 19:53:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:12.977421 | orchestrator | 2025-09-23 19:53:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:12.977503 | orchestrator | 2025-09-23 19:53:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:16.020473 | orchestrator | 2025-09-23 19:53:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:16.022627 | orchestrator | 2025-09-23 19:53:16 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:16.022664 | orchestrator | 2025-09-23 19:53:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:19.068210 | orchestrator | 2025-09-23 19:53:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:19.070122 | orchestrator | 2025-09-23 19:53:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:19.070532 | orchestrator | 2025-09-23 19:53:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:22.117786 | orchestrator | 2025-09-23 19:53:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:22.119686 | orchestrator | 2025-09-23 19:53:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:22.119761 | orchestrator | 2025-09-23 19:53:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:25.166801 | orchestrator | 2025-09-23 19:53:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:25.168671 | orchestrator | 2025-09-23 19:53:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:25.168997 | orchestrator | 2025-09-23 19:53:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:28.212572 | orchestrator | 2025-09-23 19:53:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:28.214555 | orchestrator | 2025-09-23 19:53:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:28.214588 | orchestrator | 2025-09-23 19:53:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:31.263245 | orchestrator | 2025-09-23 19:53:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:31.265886 | orchestrator | 2025-09-23 19:53:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 19:53:31.266338 | orchestrator | 2025-09-23 19:53:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:34.317808 | orchestrator | 2025-09-23 19:53:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:34.319824 | orchestrator | 2025-09-23 19:53:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:34.320046 | orchestrator | 2025-09-23 19:53:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:37.360920 | orchestrator | 2025-09-23 19:53:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:37.363218 | orchestrator | 2025-09-23 19:53:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:37.363270 | orchestrator | 2025-09-23 19:53:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:40.401018 | orchestrator | 2025-09-23 19:53:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:40.403421 | orchestrator | 2025-09-23 19:53:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:40.403765 | orchestrator | 2025-09-23 19:53:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:43.448038 | orchestrator | 2025-09-23 19:53:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:43.450299 | orchestrator | 2025-09-23 19:53:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:43.450538 | orchestrator | 2025-09-23 19:53:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:46.496599 | orchestrator | 2025-09-23 19:53:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:46.498615 | orchestrator | 2025-09-23 19:53:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:46.498666 | orchestrator | 2025-09-23 19:53:46 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 19:53:49.547433 | orchestrator | 2025-09-23 19:53:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:49.549105 | orchestrator | 2025-09-23 19:53:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:49.549326 | orchestrator | 2025-09-23 19:53:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:52.589746 | orchestrator | 2025-09-23 19:53:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:52.591909 | orchestrator | 2025-09-23 19:53:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:52.591959 | orchestrator | 2025-09-23 19:53:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:55.637027 | orchestrator | 2025-09-23 19:53:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:55.638546 | orchestrator | 2025-09-23 19:53:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:55.638630 | orchestrator | 2025-09-23 19:53:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:53:58.682314 | orchestrator | 2025-09-23 19:53:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:53:58.683899 | orchestrator | 2025-09-23 19:53:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:53:58.683976 | orchestrator | 2025-09-23 19:53:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:54:01.733243 | orchestrator | 2025-09-23 19:54:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:54:01.734656 | orchestrator | 2025-09-23 19:54:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:54:01.734703 | orchestrator | 2025-09-23 19:54:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:54:04.784200 | orchestrator | 
2025-09-23 19:54:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:54:04.785783 | orchestrator | 2025-09-23 19:54:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:54:04.785799 | orchestrator | 2025-09-23 19:54:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:54:07.835359 | orchestrator | 2025-09-23 19:54:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:54:07.837479 | orchestrator | 2025-09-23 19:54:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:54:07.837536 | orchestrator | 2025-09-23 19:54:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:54:10.882769 | orchestrator | 2025-09-23 19:54:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:54:10.883642 | orchestrator | 2025-09-23 19:54:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:54:10.883723 | orchestrator | 2025-09-23 19:54:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:54:13.925945 | orchestrator | 2025-09-23 19:54:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:54:13.927017 | orchestrator | 2025-09-23 19:54:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:54:13.927088 | orchestrator | 2025-09-23 19:54:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:54:16.972007 | orchestrator | 2025-09-23 19:54:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:54:16.973925 | orchestrator | 2025-09-23 19:54:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:54:16.974004 | orchestrator | 2025-09-23 19:54:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:54:20.025692 | orchestrator | 2025-09-23 19:54:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 19:54:20.027202 | orchestrator | 2025-09-23 19:54:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:54:20.027233 | orchestrator | 2025-09-23 19:54:20 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds from 19:54:23 to 19:59:49: tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f remain in state STARTED, followed each cycle by "Wait 1 second(s) until the next check" ...]
2025-09-23 19:59:52.210851 | orchestrator | 2025-09-23 19:59:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:59:52.211879 | orchestrator | 2025-09-23 19:59:52 |
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:59:52.211910 | orchestrator | 2025-09-23 19:59:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:59:55.253511 | orchestrator | 2025-09-23 19:59:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:59:55.255675 | orchestrator | 2025-09-23 19:59:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:59:55.255911 | orchestrator | 2025-09-23 19:59:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 19:59:58.294825 | orchestrator | 2025-09-23 19:59:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 19:59:58.296375 | orchestrator | 2025-09-23 19:59:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 19:59:58.296495 | orchestrator | 2025-09-23 19:59:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:01.330337 | orchestrator | 2025-09-23 20:00:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:01.330868 | orchestrator | 2025-09-23 20:00:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:01.330904 | orchestrator | 2025-09-23 20:00:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:04.379429 | orchestrator | 2025-09-23 20:00:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:04.380868 | orchestrator | 2025-09-23 20:00:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:04.380955 | orchestrator | 2025-09-23 20:00:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:07.419660 | orchestrator | 2025-09-23 20:00:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:07.421571 | orchestrator | 2025-09-23 20:00:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:00:07.421681 | orchestrator | 2025-09-23 20:00:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:10.462571 | orchestrator | 2025-09-23 20:00:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:10.463912 | orchestrator | 2025-09-23 20:00:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:10.464454 | orchestrator | 2025-09-23 20:00:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:13.508303 | orchestrator | 2025-09-23 20:00:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:13.509747 | orchestrator | 2025-09-23 20:00:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:13.509779 | orchestrator | 2025-09-23 20:00:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:16.552901 | orchestrator | 2025-09-23 20:00:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:16.554916 | orchestrator | 2025-09-23 20:00:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:16.554950 | orchestrator | 2025-09-23 20:00:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:19.594532 | orchestrator | 2025-09-23 20:00:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:19.595891 | orchestrator | 2025-09-23 20:00:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:19.595924 | orchestrator | 2025-09-23 20:00:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:22.641648 | orchestrator | 2025-09-23 20:00:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:22.643197 | orchestrator | 2025-09-23 20:00:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:22.643279 | orchestrator | 2025-09-23 20:00:22 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:00:25.684836 | orchestrator | 2025-09-23 20:00:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:25.686877 | orchestrator | 2025-09-23 20:00:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:25.686989 | orchestrator | 2025-09-23 20:00:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:28.727676 | orchestrator | 2025-09-23 20:00:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:28.729215 | orchestrator | 2025-09-23 20:00:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:28.729251 | orchestrator | 2025-09-23 20:00:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:31.775454 | orchestrator | 2025-09-23 20:00:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:31.777055 | orchestrator | 2025-09-23 20:00:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:31.777189 | orchestrator | 2025-09-23 20:00:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:34.821450 | orchestrator | 2025-09-23 20:00:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:34.821776 | orchestrator | 2025-09-23 20:00:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:34.821821 | orchestrator | 2025-09-23 20:00:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:37.867063 | orchestrator | 2025-09-23 20:00:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:37.869357 | orchestrator | 2025-09-23 20:00:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:37.869390 | orchestrator | 2025-09-23 20:00:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:40.912467 | orchestrator | 
2025-09-23 20:00:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:40.913733 | orchestrator | 2025-09-23 20:00:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:40.913767 | orchestrator | 2025-09-23 20:00:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:43.959400 | orchestrator | 2025-09-23 20:00:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:43.960703 | orchestrator | 2025-09-23 20:00:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:43.961126 | orchestrator | 2025-09-23 20:00:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:47.007832 | orchestrator | 2025-09-23 20:00:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:47.008751 | orchestrator | 2025-09-23 20:00:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:47.008805 | orchestrator | 2025-09-23 20:00:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:50.058935 | orchestrator | 2025-09-23 20:00:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:50.059590 | orchestrator | 2025-09-23 20:00:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:50.059628 | orchestrator | 2025-09-23 20:00:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:53.103450 | orchestrator | 2025-09-23 20:00:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:53.105745 | orchestrator | 2025-09-23 20:00:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:53.105778 | orchestrator | 2025-09-23 20:00:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:56.146210 | orchestrator | 2025-09-23 20:00:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:00:56.147519 | orchestrator | 2025-09-23 20:00:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:56.147551 | orchestrator | 2025-09-23 20:00:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:00:59.190946 | orchestrator | 2025-09-23 20:00:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:00:59.192201 | orchestrator | 2025-09-23 20:00:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:00:59.192386 | orchestrator | 2025-09-23 20:00:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:02.240938 | orchestrator | 2025-09-23 20:01:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:02.242589 | orchestrator | 2025-09-23 20:01:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:02.242631 | orchestrator | 2025-09-23 20:01:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:05.285307 | orchestrator | 2025-09-23 20:01:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:05.286982 | orchestrator | 2025-09-23 20:01:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:05.287198 | orchestrator | 2025-09-23 20:01:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:08.335781 | orchestrator | 2025-09-23 20:01:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:08.338305 | orchestrator | 2025-09-23 20:01:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:08.338385 | orchestrator | 2025-09-23 20:01:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:11.382370 | orchestrator | 2025-09-23 20:01:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:11.383823 | orchestrator | 2025-09-23 20:01:11 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:11.384123 | orchestrator | 2025-09-23 20:01:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:14.426455 | orchestrator | 2025-09-23 20:01:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:14.429601 | orchestrator | 2025-09-23 20:01:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:14.429656 | orchestrator | 2025-09-23 20:01:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:17.466401 | orchestrator | 2025-09-23 20:01:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:17.467668 | orchestrator | 2025-09-23 20:01:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:17.468185 | orchestrator | 2025-09-23 20:01:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:20.517942 | orchestrator | 2025-09-23 20:01:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:20.520188 | orchestrator | 2025-09-23 20:01:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:20.520296 | orchestrator | 2025-09-23 20:01:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:23.565331 | orchestrator | 2025-09-23 20:01:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:23.566277 | orchestrator | 2025-09-23 20:01:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:23.566309 | orchestrator | 2025-09-23 20:01:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:26.612255 | orchestrator | 2025-09-23 20:01:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:26.613362 | orchestrator | 2025-09-23 20:01:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:01:26.613395 | orchestrator | 2025-09-23 20:01:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:29.656356 | orchestrator | 2025-09-23 20:01:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:29.658507 | orchestrator | 2025-09-23 20:01:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:29.658551 | orchestrator | 2025-09-23 20:01:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:32.709726 | orchestrator | 2025-09-23 20:01:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:32.710548 | orchestrator | 2025-09-23 20:01:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:32.710577 | orchestrator | 2025-09-23 20:01:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:35.759385 | orchestrator | 2025-09-23 20:01:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:35.760897 | orchestrator | 2025-09-23 20:01:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:35.761202 | orchestrator | 2025-09-23 20:01:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:38.804501 | orchestrator | 2025-09-23 20:01:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:38.805426 | orchestrator | 2025-09-23 20:01:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:38.805507 | orchestrator | 2025-09-23 20:01:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:41.853839 | orchestrator | 2025-09-23 20:01:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:41.856324 | orchestrator | 2025-09-23 20:01:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:41.856361 | orchestrator | 2025-09-23 20:01:41 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:01:44.900715 | orchestrator | 2025-09-23 20:01:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:44.902696 | orchestrator | 2025-09-23 20:01:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:44.902729 | orchestrator | 2025-09-23 20:01:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:47.951831 | orchestrator | 2025-09-23 20:01:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:47.953850 | orchestrator | 2025-09-23 20:01:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:47.954453 | orchestrator | 2025-09-23 20:01:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:50.998833 | orchestrator | 2025-09-23 20:01:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:51.003383 | orchestrator | 2025-09-23 20:01:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:51.003460 | orchestrator | 2025-09-23 20:01:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:54.047776 | orchestrator | 2025-09-23 20:01:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:54.049979 | orchestrator | 2025-09-23 20:01:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:54.050071 | orchestrator | 2025-09-23 20:01:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:01:57.096002 | orchestrator | 2025-09-23 20:01:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:01:57.097422 | orchestrator | 2025-09-23 20:01:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:01:57.097461 | orchestrator | 2025-09-23 20:01:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:00.138922 | orchestrator | 
2025-09-23 20:02:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:00.139270 | orchestrator | 2025-09-23 20:02:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:00.139308 | orchestrator | 2025-09-23 20:02:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:03.184874 | orchestrator | 2025-09-23 20:02:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:03.186638 | orchestrator | 2025-09-23 20:02:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:03.186691 | orchestrator | 2025-09-23 20:02:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:06.228867 | orchestrator | 2025-09-23 20:02:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:06.231169 | orchestrator | 2025-09-23 20:02:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:06.231319 | orchestrator | 2025-09-23 20:02:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:09.270944 | orchestrator | 2025-09-23 20:02:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:09.273049 | orchestrator | 2025-09-23 20:02:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:09.273082 | orchestrator | 2025-09-23 20:02:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:12.322304 | orchestrator | 2025-09-23 20:02:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:12.324232 | orchestrator | 2025-09-23 20:02:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:12.324489 | orchestrator | 2025-09-23 20:02:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:15.366540 | orchestrator | 2025-09-23 20:02:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:02:15.367498 | orchestrator | 2025-09-23 20:02:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:15.367534 | orchestrator | 2025-09-23 20:02:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:18.417594 | orchestrator | 2025-09-23 20:02:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:18.418568 | orchestrator | 2025-09-23 20:02:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:18.418606 | orchestrator | 2025-09-23 20:02:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:21.460927 | orchestrator | 2025-09-23 20:02:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:21.462292 | orchestrator | 2025-09-23 20:02:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:21.462327 | orchestrator | 2025-09-23 20:02:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:24.505241 | orchestrator | 2025-09-23 20:02:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:24.507106 | orchestrator | 2025-09-23 20:02:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:24.507306 | orchestrator | 2025-09-23 20:02:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:27.550291 | orchestrator | 2025-09-23 20:02:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:27.552109 | orchestrator | 2025-09-23 20:02:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:27.552142 | orchestrator | 2025-09-23 20:02:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:30.601319 | orchestrator | 2025-09-23 20:02:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:30.601936 | orchestrator | 2025-09-23 20:02:30 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:30.601976 | orchestrator | 2025-09-23 20:02:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:33.647625 | orchestrator | 2025-09-23 20:02:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:33.648508 | orchestrator | 2025-09-23 20:02:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:33.648567 | orchestrator | 2025-09-23 20:02:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:36.691525 | orchestrator | 2025-09-23 20:02:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:36.693141 | orchestrator | 2025-09-23 20:02:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:36.693204 | orchestrator | 2025-09-23 20:02:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:39.741023 | orchestrator | 2025-09-23 20:02:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:39.742941 | orchestrator | 2025-09-23 20:02:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:39.743018 | orchestrator | 2025-09-23 20:02:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:42.792073 | orchestrator | 2025-09-23 20:02:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:42.793762 | orchestrator | 2025-09-23 20:02:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:42.793796 | orchestrator | 2025-09-23 20:02:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:45.840586 | orchestrator | 2025-09-23 20:02:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:45.842785 | orchestrator | 2025-09-23 20:02:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:02:45.842810 | orchestrator | 2025-09-23 20:02:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:48.887204 | orchestrator | 2025-09-23 20:02:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:48.888101 | orchestrator | 2025-09-23 20:02:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:48.888140 | orchestrator | 2025-09-23 20:02:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:51.935136 | orchestrator | 2025-09-23 20:02:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:51.937020 | orchestrator | 2025-09-23 20:02:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:51.937101 | orchestrator | 2025-09-23 20:02:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:54.979804 | orchestrator | 2025-09-23 20:02:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:54.982015 | orchestrator | 2025-09-23 20:02:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:54.982340 | orchestrator | 2025-09-23 20:02:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:02:58.023739 | orchestrator | 2025-09-23 20:02:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:02:58.024987 | orchestrator | 2025-09-23 20:02:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:02:58.025092 | orchestrator | 2025-09-23 20:02:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:01.074199 | orchestrator | 2025-09-23 20:03:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:01.075273 | orchestrator | 2025-09-23 20:03:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:01.075384 | orchestrator | 2025-09-23 20:03:01 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:03:04.128783 | orchestrator | 2025-09-23 20:03:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:04.130128 | orchestrator | 2025-09-23 20:03:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:04.130217 | orchestrator | 2025-09-23 20:03:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:07.178392 | orchestrator | 2025-09-23 20:03:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:07.181642 | orchestrator | 2025-09-23 20:03:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:07.181778 | orchestrator | 2025-09-23 20:03:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:10.230492 | orchestrator | 2025-09-23 20:03:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:10.231285 | orchestrator | 2025-09-23 20:03:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:10.231901 | orchestrator | 2025-09-23 20:03:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:13.281664 | orchestrator | 2025-09-23 20:03:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:13.282903 | orchestrator | 2025-09-23 20:03:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:13.282938 | orchestrator | 2025-09-23 20:03:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:16.322435 | orchestrator | 2025-09-23 20:03:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:16.325517 | orchestrator | 2025-09-23 20:03:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:16.325552 | orchestrator | 2025-09-23 20:03:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:19.375731 | orchestrator | 
2025-09-23 20:03:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:19.378397 | orchestrator | 2025-09-23 20:03:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:19.378436 | orchestrator | 2025-09-23 20:03:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:22.424002 | orchestrator | 2025-09-23 20:03:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:22.424754 | orchestrator | 2025-09-23 20:03:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:22.424784 | orchestrator | 2025-09-23 20:03:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:25.468449 | orchestrator | 2025-09-23 20:03:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:25.469486 | orchestrator | 2025-09-23 20:03:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:25.469528 | orchestrator | 2025-09-23 20:03:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:28.520181 | orchestrator | 2025-09-23 20:03:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:28.522000 | orchestrator | 2025-09-23 20:03:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:28.522110 | orchestrator | 2025-09-23 20:03:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:31.570834 | orchestrator | 2025-09-23 20:03:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:03:31.571814 | orchestrator | 2025-09-23 20:03:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:31.571923 | orchestrator | 2025-09-23 20:03:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:03:34.617471 | orchestrator | 2025-09-23 20:03:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:03:34.618929 | orchestrator | 2025-09-23 20:03:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:03:34.618973 | orchestrator | 2025-09-23 20:03:34 | INFO  | Wait 1 second(s) until the next check
[... repetitive polling output elided: approximately 100 further identical check cycles, one every ~3 seconds from 20:03:37 to 20:08:48, each reporting tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f in state STARTED followed by "Wait 1 second(s) until the next check" ...]
2025-09-23 20:08:51.558717 | orchestrator | 2025-09-23 20:08:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in
state STARTED 2025-09-23 20:08:51.559771 | orchestrator | 2025-09-23 20:08:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:08:51.559806 | orchestrator | 2025-09-23 20:08:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:08:54.605635 | orchestrator | 2025-09-23 20:08:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:08:54.607395 | orchestrator | 2025-09-23 20:08:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:08:54.607442 | orchestrator | 2025-09-23 20:08:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:08:57.645658 | orchestrator | 2025-09-23 20:08:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:08:57.647358 | orchestrator | 2025-09-23 20:08:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:08:57.647393 | orchestrator | 2025-09-23 20:08:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:00.701903 | orchestrator | 2025-09-23 20:09:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:00.704092 | orchestrator | 2025-09-23 20:09:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:00.704181 | orchestrator | 2025-09-23 20:09:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:03.754495 | orchestrator | 2025-09-23 20:09:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:03.756316 | orchestrator | 2025-09-23 20:09:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:03.756364 | orchestrator | 2025-09-23 20:09:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:06.805478 | orchestrator | 2025-09-23 20:09:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:06.808257 | orchestrator | 2025-09-23 20:09:06 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:06.808293 | orchestrator | 2025-09-23 20:09:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:09.851320 | orchestrator | 2025-09-23 20:09:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:09.853273 | orchestrator | 2025-09-23 20:09:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:09.853356 | orchestrator | 2025-09-23 20:09:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:12.897601 | orchestrator | 2025-09-23 20:09:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:12.899351 | orchestrator | 2025-09-23 20:09:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:12.899400 | orchestrator | 2025-09-23 20:09:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:15.947008 | orchestrator | 2025-09-23 20:09:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:15.948284 | orchestrator | 2025-09-23 20:09:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:15.948319 | orchestrator | 2025-09-23 20:09:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:18.993640 | orchestrator | 2025-09-23 20:09:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:18.995016 | orchestrator | 2025-09-23 20:09:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:18.995146 | orchestrator | 2025-09-23 20:09:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:22.041502 | orchestrator | 2025-09-23 20:09:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:22.042891 | orchestrator | 2025-09-23 20:09:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:09:22.043262 | orchestrator | 2025-09-23 20:09:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:25.090383 | orchestrator | 2025-09-23 20:09:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:25.091506 | orchestrator | 2025-09-23 20:09:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:25.091891 | orchestrator | 2025-09-23 20:09:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:28.138804 | orchestrator | 2025-09-23 20:09:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:28.140518 | orchestrator | 2025-09-23 20:09:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:28.140568 | orchestrator | 2025-09-23 20:09:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:31.189431 | orchestrator | 2025-09-23 20:09:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:31.190598 | orchestrator | 2025-09-23 20:09:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:31.190633 | orchestrator | 2025-09-23 20:09:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:34.237903 | orchestrator | 2025-09-23 20:09:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:34.240261 | orchestrator | 2025-09-23 20:09:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:34.240361 | orchestrator | 2025-09-23 20:09:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:37.276541 | orchestrator | 2025-09-23 20:09:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:37.277991 | orchestrator | 2025-09-23 20:09:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:37.278137 | orchestrator | 2025-09-23 20:09:37 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:09:40.322577 | orchestrator | 2025-09-23 20:09:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:40.324012 | orchestrator | 2025-09-23 20:09:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:40.324315 | orchestrator | 2025-09-23 20:09:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:43.373480 | orchestrator | 2025-09-23 20:09:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:43.375404 | orchestrator | 2025-09-23 20:09:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:43.375445 | orchestrator | 2025-09-23 20:09:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:46.420929 | orchestrator | 2025-09-23 20:09:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:46.422931 | orchestrator | 2025-09-23 20:09:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:46.422992 | orchestrator | 2025-09-23 20:09:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:49.456838 | orchestrator | 2025-09-23 20:09:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:49.458149 | orchestrator | 2025-09-23 20:09:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:49.458184 | orchestrator | 2025-09-23 20:09:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:52.502174 | orchestrator | 2025-09-23 20:09:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:52.504551 | orchestrator | 2025-09-23 20:09:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:52.504925 | orchestrator | 2025-09-23 20:09:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:55.552958 | orchestrator | 
2025-09-23 20:09:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:55.556629 | orchestrator | 2025-09-23 20:09:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:55.556739 | orchestrator | 2025-09-23 20:09:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:09:58.603664 | orchestrator | 2025-09-23 20:09:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:09:58.604556 | orchestrator | 2025-09-23 20:09:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:09:58.604590 | orchestrator | 2025-09-23 20:09:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:01.649635 | orchestrator | 2025-09-23 20:10:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:01.650659 | orchestrator | 2025-09-23 20:10:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:01.650742 | orchestrator | 2025-09-23 20:10:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:04.703974 | orchestrator | 2025-09-23 20:10:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:04.706013 | orchestrator | 2025-09-23 20:10:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:04.706193 | orchestrator | 2025-09-23 20:10:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:07.747626 | orchestrator | 2025-09-23 20:10:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:07.748855 | orchestrator | 2025-09-23 20:10:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:07.749094 | orchestrator | 2025-09-23 20:10:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:10.788630 | orchestrator | 2025-09-23 20:10:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:10:10.789578 | orchestrator | 2025-09-23 20:10:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:10.789628 | orchestrator | 2025-09-23 20:10:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:13.834235 | orchestrator | 2025-09-23 20:10:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:13.835320 | orchestrator | 2025-09-23 20:10:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:13.835379 | orchestrator | 2025-09-23 20:10:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:16.880659 | orchestrator | 2025-09-23 20:10:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:16.881807 | orchestrator | 2025-09-23 20:10:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:16.881922 | orchestrator | 2025-09-23 20:10:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:19.931205 | orchestrator | 2025-09-23 20:10:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:19.932519 | orchestrator | 2025-09-23 20:10:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:19.932623 | orchestrator | 2025-09-23 20:10:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:22.980649 | orchestrator | 2025-09-23 20:10:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:22.982558 | orchestrator | 2025-09-23 20:10:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:22.982593 | orchestrator | 2025-09-23 20:10:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:26.021940 | orchestrator | 2025-09-23 20:10:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:26.022763 | orchestrator | 2025-09-23 20:10:26 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:26.022800 | orchestrator | 2025-09-23 20:10:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:29.068885 | orchestrator | 2025-09-23 20:10:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:29.071688 | orchestrator | 2025-09-23 20:10:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:29.071767 | orchestrator | 2025-09-23 20:10:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:32.116386 | orchestrator | 2025-09-23 20:10:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:32.117922 | orchestrator | 2025-09-23 20:10:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:32.118147 | orchestrator | 2025-09-23 20:10:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:35.161700 | orchestrator | 2025-09-23 20:10:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:35.163278 | orchestrator | 2025-09-23 20:10:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:35.163346 | orchestrator | 2025-09-23 20:10:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:38.203582 | orchestrator | 2025-09-23 20:10:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:38.204954 | orchestrator | 2025-09-23 20:10:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:38.205038 | orchestrator | 2025-09-23 20:10:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:41.254352 | orchestrator | 2025-09-23 20:10:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:41.255495 | orchestrator | 2025-09-23 20:10:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:10:41.255584 | orchestrator | 2025-09-23 20:10:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:44.301037 | orchestrator | 2025-09-23 20:10:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:44.302487 | orchestrator | 2025-09-23 20:10:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:44.302527 | orchestrator | 2025-09-23 20:10:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:47.349896 | orchestrator | 2025-09-23 20:10:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:47.351199 | orchestrator | 2025-09-23 20:10:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:47.351293 | orchestrator | 2025-09-23 20:10:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:50.395501 | orchestrator | 2025-09-23 20:10:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:50.396742 | orchestrator | 2025-09-23 20:10:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:50.396779 | orchestrator | 2025-09-23 20:10:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:53.440647 | orchestrator | 2025-09-23 20:10:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:53.442536 | orchestrator | 2025-09-23 20:10:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:53.442818 | orchestrator | 2025-09-23 20:10:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:10:56.492544 | orchestrator | 2025-09-23 20:10:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:56.494548 | orchestrator | 2025-09-23 20:10:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:56.494587 | orchestrator | 2025-09-23 20:10:56 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:10:59.539143 | orchestrator | 2025-09-23 20:10:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:10:59.541244 | orchestrator | 2025-09-23 20:10:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:10:59.541292 | orchestrator | 2025-09-23 20:10:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:02.582637 | orchestrator | 2025-09-23 20:11:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:02.584841 | orchestrator | 2025-09-23 20:11:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:02.584872 | orchestrator | 2025-09-23 20:11:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:05.630107 | orchestrator | 2025-09-23 20:11:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:05.633301 | orchestrator | 2025-09-23 20:11:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:05.633334 | orchestrator | 2025-09-23 20:11:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:08.677907 | orchestrator | 2025-09-23 20:11:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:08.679135 | orchestrator | 2025-09-23 20:11:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:08.679177 | orchestrator | 2025-09-23 20:11:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:11.724923 | orchestrator | 2025-09-23 20:11:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:11.726763 | orchestrator | 2025-09-23 20:11:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:11.726818 | orchestrator | 2025-09-23 20:11:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:14.775768 | orchestrator | 
2025-09-23 20:11:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:14.777823 | orchestrator | 2025-09-23 20:11:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:14.777861 | orchestrator | 2025-09-23 20:11:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:17.822834 | orchestrator | 2025-09-23 20:11:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:17.824793 | orchestrator | 2025-09-23 20:11:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:17.824825 | orchestrator | 2025-09-23 20:11:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:20.870784 | orchestrator | 2025-09-23 20:11:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:20.872057 | orchestrator | 2025-09-23 20:11:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:20.872104 | orchestrator | 2025-09-23 20:11:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:23.916670 | orchestrator | 2025-09-23 20:11:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:23.917919 | orchestrator | 2025-09-23 20:11:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:23.918276 | orchestrator | 2025-09-23 20:11:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:26.958497 | orchestrator | 2025-09-23 20:11:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:26.959404 | orchestrator | 2025-09-23 20:11:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:26.959482 | orchestrator | 2025-09-23 20:11:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:30.001837 | orchestrator | 2025-09-23 20:11:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:11:30.003331 | orchestrator | 2025-09-23 20:11:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:30.003426 | orchestrator | 2025-09-23 20:11:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:33.052055 | orchestrator | 2025-09-23 20:11:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:33.052284 | orchestrator | 2025-09-23 20:11:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:33.052307 | orchestrator | 2025-09-23 20:11:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:36.098982 | orchestrator | 2025-09-23 20:11:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:36.101402 | orchestrator | 2025-09-23 20:11:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:36.101717 | orchestrator | 2025-09-23 20:11:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:39.152016 | orchestrator | 2025-09-23 20:11:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:39.153882 | orchestrator | 2025-09-23 20:11:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:39.154104 | orchestrator | 2025-09-23 20:11:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:42.201488 | orchestrator | 2025-09-23 20:11:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:42.203474 | orchestrator | 2025-09-23 20:11:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:42.203512 | orchestrator | 2025-09-23 20:11:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:45.251029 | orchestrator | 2025-09-23 20:11:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:45.252106 | orchestrator | 2025-09-23 20:11:45 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:45.252141 | orchestrator | 2025-09-23 20:11:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:48.296045 | orchestrator | 2025-09-23 20:11:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:48.298157 | orchestrator | 2025-09-23 20:11:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:48.298246 | orchestrator | 2025-09-23 20:11:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:51.342136 | orchestrator | 2025-09-23 20:11:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:51.343678 | orchestrator | 2025-09-23 20:11:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:51.343743 | orchestrator | 2025-09-23 20:11:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:54.389975 | orchestrator | 2025-09-23 20:11:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:54.390699 | orchestrator | 2025-09-23 20:11:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:54.392016 | orchestrator | 2025-09-23 20:11:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:11:57.439968 | orchestrator | 2025-09-23 20:11:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:11:57.441488 | orchestrator | 2025-09-23 20:11:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:11:57.441690 | orchestrator | 2025-09-23 20:11:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:12:00.486184 | orchestrator | 2025-09-23 20:12:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:12:00.487967 | orchestrator | 2025-09-23 20:12:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:12:00.488002 | orchestrator | 2025-09-23 20:12:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:12:03.532689 | orchestrator | 2025-09-23 20:12:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:12:03.534594 | orchestrator | 2025-09-23 20:12:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:12:03.534699 | orchestrator | 2025-09-23 20:12:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:12:06.575535 | orchestrator | 2025-09-23 20:12:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:12:06.576623 | orchestrator | 2025-09-23 20:12:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:12:06.577119 | orchestrator | 2025-09-23 20:12:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:12:09.622156 | orchestrator | 2025-09-23 20:12:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:12:09.628670 | orchestrator | 2025-09-23 20:12:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:12:09.628705 | orchestrator | 2025-09-23 20:12:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:12:12.666544 | orchestrator | 2025-09-23 20:12:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:12:12.668450 | orchestrator | 2025-09-23 20:12:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:12:12.668576 | orchestrator | 2025-09-23 20:12:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:12:15.708395 | orchestrator | 2025-09-23 20:12:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:12:15.709971 | orchestrator | 2025-09-23 20:12:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:12:15.710006 | orchestrator | 2025-09-23 20:12:15 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:17:35.590591 | orchestrator | 2025-09-23 20:17:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:35.591905 | orchestrator | 2025-09-23 20:17:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:35.591991 | orchestrator | 2025-09-23 20:17:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:17:38.632699 | orchestrator | 2025-09-23 20:17:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:38.633601 | orchestrator | 2025-09-23 20:17:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:38.633680 | orchestrator | 2025-09-23 20:17:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:17:41.680400 | orchestrator | 2025-09-23 20:17:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:41.682320 | orchestrator | 2025-09-23 20:17:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:41.682405 | orchestrator | 2025-09-23 20:17:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:17:44.722334 | orchestrator | 2025-09-23 20:17:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:44.722470 | orchestrator | 2025-09-23 20:17:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:44.722485 | orchestrator | 2025-09-23 20:17:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:17:47.769809 | orchestrator | 2025-09-23 20:17:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:47.771618 | orchestrator | 2025-09-23 20:17:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:47.771686 | orchestrator | 2025-09-23 20:17:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:17:50.816312 | orchestrator | 
2025-09-23 20:17:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:50.818565 | orchestrator | 2025-09-23 20:17:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:50.819023 | orchestrator | 2025-09-23 20:17:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:17:53.862352 | orchestrator | 2025-09-23 20:17:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:53.864517 | orchestrator | 2025-09-23 20:17:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:53.864559 | orchestrator | 2025-09-23 20:17:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:17:56.909594 | orchestrator | 2025-09-23 20:17:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:56.911356 | orchestrator | 2025-09-23 20:17:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:56.911423 | orchestrator | 2025-09-23 20:17:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:17:59.951675 | orchestrator | 2025-09-23 20:17:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:17:59.953775 | orchestrator | 2025-09-23 20:17:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:17:59.953844 | orchestrator | 2025-09-23 20:17:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:03.001832 | orchestrator | 2025-09-23 20:18:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:03.003524 | orchestrator | 2025-09-23 20:18:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:03.003611 | orchestrator | 2025-09-23 20:18:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:06.054302 | orchestrator | 2025-09-23 20:18:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:18:06.055502 | orchestrator | 2025-09-23 20:18:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:06.055602 | orchestrator | 2025-09-23 20:18:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:09.104194 | orchestrator | 2025-09-23 20:18:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:09.106508 | orchestrator | 2025-09-23 20:18:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:09.106617 | orchestrator | 2025-09-23 20:18:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:12.154263 | orchestrator | 2025-09-23 20:18:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:12.156633 | orchestrator | 2025-09-23 20:18:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:12.156872 | orchestrator | 2025-09-23 20:18:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:15.201665 | orchestrator | 2025-09-23 20:18:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:15.206636 | orchestrator | 2025-09-23 20:18:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:15.206974 | orchestrator | 2025-09-23 20:18:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:18.261033 | orchestrator | 2025-09-23 20:18:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:18.262856 | orchestrator | 2025-09-23 20:18:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:18.262895 | orchestrator | 2025-09-23 20:18:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:21.312160 | orchestrator | 2025-09-23 20:18:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:21.315157 | orchestrator | 2025-09-23 20:18:21 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:21.315250 | orchestrator | 2025-09-23 20:18:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:24.363607 | orchestrator | 2025-09-23 20:18:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:24.366575 | orchestrator | 2025-09-23 20:18:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:24.366643 | orchestrator | 2025-09-23 20:18:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:27.420476 | orchestrator | 2025-09-23 20:18:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:27.422677 | orchestrator | 2025-09-23 20:18:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:27.422965 | orchestrator | 2025-09-23 20:18:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:30.467126 | orchestrator | 2025-09-23 20:18:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:30.468232 | orchestrator | 2025-09-23 20:18:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:30.468292 | orchestrator | 2025-09-23 20:18:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:33.515347 | orchestrator | 2025-09-23 20:18:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:33.516659 | orchestrator | 2025-09-23 20:18:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:33.516742 | orchestrator | 2025-09-23 20:18:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:36.559137 | orchestrator | 2025-09-23 20:18:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:36.561011 | orchestrator | 2025-09-23 20:18:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:18:36.561149 | orchestrator | 2025-09-23 20:18:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:39.601168 | orchestrator | 2025-09-23 20:18:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:39.602751 | orchestrator | 2025-09-23 20:18:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:39.602784 | orchestrator | 2025-09-23 20:18:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:42.648140 | orchestrator | 2025-09-23 20:18:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:42.648941 | orchestrator | 2025-09-23 20:18:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:42.649043 | orchestrator | 2025-09-23 20:18:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:45.696053 | orchestrator | 2025-09-23 20:18:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:45.698010 | orchestrator | 2025-09-23 20:18:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:45.698082 | orchestrator | 2025-09-23 20:18:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:48.743600 | orchestrator | 2025-09-23 20:18:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:48.745476 | orchestrator | 2025-09-23 20:18:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:48.745604 | orchestrator | 2025-09-23 20:18:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:51.795352 | orchestrator | 2025-09-23 20:18:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:51.796719 | orchestrator | 2025-09-23 20:18:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:51.796756 | orchestrator | 2025-09-23 20:18:51 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:18:54.841404 | orchestrator | 2025-09-23 20:18:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:54.843901 | orchestrator | 2025-09-23 20:18:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:54.843940 | orchestrator | 2025-09-23 20:18:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:18:57.888083 | orchestrator | 2025-09-23 20:18:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:18:57.890322 | orchestrator | 2025-09-23 20:18:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:18:57.890406 | orchestrator | 2025-09-23 20:18:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:00.936941 | orchestrator | 2025-09-23 20:19:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:00.938728 | orchestrator | 2025-09-23 20:19:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:00.938848 | orchestrator | 2025-09-23 20:19:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:03.987787 | orchestrator | 2025-09-23 20:19:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:03.989091 | orchestrator | 2025-09-23 20:19:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:03.989256 | orchestrator | 2025-09-23 20:19:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:07.034249 | orchestrator | 2025-09-23 20:19:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:07.036246 | orchestrator | 2025-09-23 20:19:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:07.036457 | orchestrator | 2025-09-23 20:19:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:10.077613 | orchestrator | 
2025-09-23 20:19:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:10.079387 | orchestrator | 2025-09-23 20:19:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:10.079434 | orchestrator | 2025-09-23 20:19:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:13.125063 | orchestrator | 2025-09-23 20:19:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:13.127837 | orchestrator | 2025-09-23 20:19:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:13.128016 | orchestrator | 2025-09-23 20:19:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:16.179121 | orchestrator | 2025-09-23 20:19:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:16.181226 | orchestrator | 2025-09-23 20:19:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:16.181407 | orchestrator | 2025-09-23 20:19:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:19.228948 | orchestrator | 2025-09-23 20:19:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:19.231230 | orchestrator | 2025-09-23 20:19:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:19.231269 | orchestrator | 2025-09-23 20:19:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:22.271437 | orchestrator | 2025-09-23 20:19:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:22.273026 | orchestrator | 2025-09-23 20:19:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:22.273070 | orchestrator | 2025-09-23 20:19:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:25.317103 | orchestrator | 2025-09-23 20:19:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:19:25.318599 | orchestrator | 2025-09-23 20:19:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:25.318688 | orchestrator | 2025-09-23 20:19:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:28.362412 | orchestrator | 2025-09-23 20:19:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:28.362674 | orchestrator | 2025-09-23 20:19:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:28.362700 | orchestrator | 2025-09-23 20:19:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:31.398651 | orchestrator | 2025-09-23 20:19:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:31.400335 | orchestrator | 2025-09-23 20:19:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:31.400478 | orchestrator | 2025-09-23 20:19:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:34.446455 | orchestrator | 2025-09-23 20:19:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:34.447688 | orchestrator | 2025-09-23 20:19:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:34.447736 | orchestrator | 2025-09-23 20:19:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:37.487667 | orchestrator | 2025-09-23 20:19:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:37.488832 | orchestrator | 2025-09-23 20:19:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:37.488866 | orchestrator | 2025-09-23 20:19:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:40.531655 | orchestrator | 2025-09-23 20:19:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:40.533089 | orchestrator | 2025-09-23 20:19:40 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:40.533121 | orchestrator | 2025-09-23 20:19:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:43.574859 | orchestrator | 2025-09-23 20:19:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:43.576612 | orchestrator | 2025-09-23 20:19:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:43.576747 | orchestrator | 2025-09-23 20:19:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:46.615504 | orchestrator | 2025-09-23 20:19:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:46.617290 | orchestrator | 2025-09-23 20:19:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:46.617325 | orchestrator | 2025-09-23 20:19:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:49.653638 | orchestrator | 2025-09-23 20:19:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:49.655278 | orchestrator | 2025-09-23 20:19:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:49.655636 | orchestrator | 2025-09-23 20:19:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:52.704248 | orchestrator | 2025-09-23 20:19:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:52.706772 | orchestrator | 2025-09-23 20:19:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:52.706817 | orchestrator | 2025-09-23 20:19:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:55.751187 | orchestrator | 2025-09-23 20:19:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:55.752520 | orchestrator | 2025-09-23 20:19:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:19:55.752671 | orchestrator | 2025-09-23 20:19:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:19:58.797236 | orchestrator | 2025-09-23 20:19:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:19:58.799326 | orchestrator | 2025-09-23 20:19:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:19:58.799361 | orchestrator | 2025-09-23 20:19:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:01.844003 | orchestrator | 2025-09-23 20:20:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:01.845426 | orchestrator | 2025-09-23 20:20:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:01.845477 | orchestrator | 2025-09-23 20:20:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:04.886432 | orchestrator | 2025-09-23 20:20:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:04.888196 | orchestrator | 2025-09-23 20:20:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:04.888238 | orchestrator | 2025-09-23 20:20:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:07.933721 | orchestrator | 2025-09-23 20:20:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:07.935821 | orchestrator | 2025-09-23 20:20:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:07.935871 | orchestrator | 2025-09-23 20:20:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:10.977568 | orchestrator | 2025-09-23 20:20:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:10.978398 | orchestrator | 2025-09-23 20:20:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:10.978430 | orchestrator | 2025-09-23 20:20:10 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:20:14.022923 | orchestrator | 2025-09-23 20:20:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:14.025654 | orchestrator | 2025-09-23 20:20:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:14.025690 | orchestrator | 2025-09-23 20:20:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:17.066283 | orchestrator | 2025-09-23 20:20:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:17.068260 | orchestrator | 2025-09-23 20:20:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:17.068310 | orchestrator | 2025-09-23 20:20:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:20.107209 | orchestrator | 2025-09-23 20:20:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:20.108529 | orchestrator | 2025-09-23 20:20:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:20.108701 | orchestrator | 2025-09-23 20:20:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:23.150566 | orchestrator | 2025-09-23 20:20:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:23.152146 | orchestrator | 2025-09-23 20:20:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:23.152179 | orchestrator | 2025-09-23 20:20:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:26.198887 | orchestrator | 2025-09-23 20:20:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:26.199785 | orchestrator | 2025-09-23 20:20:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:26.199818 | orchestrator | 2025-09-23 20:20:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:29.244299 | orchestrator | 
2025-09-23 20:20:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:29.245552 | orchestrator | 2025-09-23 20:20:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:29.245584 | orchestrator | 2025-09-23 20:20:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:32.294543 | orchestrator | 2025-09-23 20:20:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:32.295696 | orchestrator | 2025-09-23 20:20:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:32.295974 | orchestrator | 2025-09-23 20:20:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:35.335840 | orchestrator | 2025-09-23 20:20:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:35.337876 | orchestrator | 2025-09-23 20:20:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:35.338417 | orchestrator | 2025-09-23 20:20:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:38.382339 | orchestrator | 2025-09-23 20:20:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:38.383772 | orchestrator | 2025-09-23 20:20:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:38.383817 | orchestrator | 2025-09-23 20:20:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:41.430141 | orchestrator | 2025-09-23 20:20:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:41.431917 | orchestrator | 2025-09-23 20:20:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:41.432042 | orchestrator | 2025-09-23 20:20:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:44.478440 | orchestrator | 2025-09-23 20:20:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:20:44.480185 | orchestrator | 2025-09-23 20:20:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:44.480405 | orchestrator | 2025-09-23 20:20:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:47.521776 | orchestrator | 2025-09-23 20:20:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:47.523193 | orchestrator | 2025-09-23 20:20:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:47.523247 | orchestrator | 2025-09-23 20:20:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:50.574645 | orchestrator | 2025-09-23 20:20:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:50.578275 | orchestrator | 2025-09-23 20:20:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:50.578323 | orchestrator | 2025-09-23 20:20:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:53.612924 | orchestrator | 2025-09-23 20:20:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:53.614450 | orchestrator | 2025-09-23 20:20:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:53.614570 | orchestrator | 2025-09-23 20:20:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:56.663129 | orchestrator | 2025-09-23 20:20:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:56.665308 | orchestrator | 2025-09-23 20:20:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:56.665420 | orchestrator | 2025-09-23 20:20:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:20:59.713189 | orchestrator | 2025-09-23 20:20:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:20:59.713888 | orchestrator | 2025-09-23 20:20:59 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:20:59.713923 | orchestrator | 2025-09-23 20:20:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:21:02.760923 | orchestrator | 2025-09-23 20:21:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:21:02.762222 | orchestrator | 2025-09-23 20:21:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:21:02.762314 | orchestrator | 2025-09-23 20:21:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:21:05.802901 | orchestrator | 2025-09-23 20:21:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:21:05.805488 | orchestrator | 2025-09-23 20:21:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:21:05.805522 | orchestrator | 2025-09-23 20:21:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:21:08.850210 | orchestrator | 2025-09-23 20:21:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:21:08.852627 | orchestrator | 2025-09-23 20:21:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:21:08.852661 | orchestrator | 2025-09-23 20:21:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:21:11.894631 | orchestrator | 2025-09-23 20:21:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:21:11.896357 | orchestrator | 2025-09-23 20:21:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:21:11.896396 | orchestrator | 2025-09-23 20:21:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:21:14.939556 | orchestrator | 2025-09-23 20:21:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:21:14.940473 | orchestrator | 2025-09-23 20:21:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:21:14.940506 | orchestrator | 2025-09-23 20:21:14 | INFO  | Wait 1 second(s) until the next check
2025-09-23 20:21:17.988001 | orchestrator | 2025-09-23 20:21:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 20:21:17.988767 | orchestrator | 2025-09-23 20:21:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 20:21:17.988831 | orchestrator | 2025-09-23 20:21:17 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 20:21:21 through 20:26:47; both tasks remained in state STARTED throughout ...]
2025-09-23 20:26:47.024439 | orchestrator | 2025-09-23 20:26:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 20:26:47.025955 | orchestrator | 2025-09-23 20:26:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 20:26:47.026219 | orchestrator | 2025-09-23 20:26:47 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:26:50.074212 | orchestrator | 2025-09-23 20:26:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:26:50.077081 | orchestrator | 2025-09-23 20:26:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:26:50.077182 | orchestrator | 2025-09-23 20:26:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:26:53.112272 | orchestrator | 2025-09-23 20:26:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:26:53.113919 | orchestrator | 2025-09-23 20:26:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:26:53.113947 | orchestrator | 2025-09-23 20:26:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:26:56.157329 | orchestrator | 2025-09-23 20:26:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:26:56.158528 | orchestrator | 2025-09-23 20:26:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:26:56.158647 | orchestrator | 2025-09-23 20:26:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:26:59.205493 | orchestrator | 2025-09-23 20:26:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:26:59.207532 | orchestrator | 2025-09-23 20:26:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:26:59.207592 | orchestrator | 2025-09-23 20:26:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:02.255595 | orchestrator | 2025-09-23 20:27:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:02.256899 | orchestrator | 2025-09-23 20:27:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:02.256927 | orchestrator | 2025-09-23 20:27:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:05.304260 | orchestrator | 
2025-09-23 20:27:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:05.305879 | orchestrator | 2025-09-23 20:27:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:05.305934 | orchestrator | 2025-09-23 20:27:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:08.350296 | orchestrator | 2025-09-23 20:27:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:08.352305 | orchestrator | 2025-09-23 20:27:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:08.352335 | orchestrator | 2025-09-23 20:27:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:11.394786 | orchestrator | 2025-09-23 20:27:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:11.396803 | orchestrator | 2025-09-23 20:27:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:11.396833 | orchestrator | 2025-09-23 20:27:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:14.443942 | orchestrator | 2025-09-23 20:27:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:14.445396 | orchestrator | 2025-09-23 20:27:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:14.445442 | orchestrator | 2025-09-23 20:27:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:17.488466 | orchestrator | 2025-09-23 20:27:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:17.489919 | orchestrator | 2025-09-23 20:27:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:17.489950 | orchestrator | 2025-09-23 20:27:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:20.536591 | orchestrator | 2025-09-23 20:27:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:27:20.538655 | orchestrator | 2025-09-23 20:27:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:20.538689 | orchestrator | 2025-09-23 20:27:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:23.579594 | orchestrator | 2025-09-23 20:27:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:23.580429 | orchestrator | 2025-09-23 20:27:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:23.580460 | orchestrator | 2025-09-23 20:27:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:26.627551 | orchestrator | 2025-09-23 20:27:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:26.628399 | orchestrator | 2025-09-23 20:27:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:26.628428 | orchestrator | 2025-09-23 20:27:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:29.675642 | orchestrator | 2025-09-23 20:27:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:29.679276 | orchestrator | 2025-09-23 20:27:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:29.679322 | orchestrator | 2025-09-23 20:27:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:32.726257 | orchestrator | 2025-09-23 20:27:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:32.727224 | orchestrator | 2025-09-23 20:27:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:32.727254 | orchestrator | 2025-09-23 20:27:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:35.776585 | orchestrator | 2025-09-23 20:27:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:35.778845 | orchestrator | 2025-09-23 20:27:35 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:35.778995 | orchestrator | 2025-09-23 20:27:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:38.829395 | orchestrator | 2025-09-23 20:27:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:38.831553 | orchestrator | 2025-09-23 20:27:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:38.831598 | orchestrator | 2025-09-23 20:27:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:41.873634 | orchestrator | 2025-09-23 20:27:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:41.875036 | orchestrator | 2025-09-23 20:27:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:41.875108 | orchestrator | 2025-09-23 20:27:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:44.919510 | orchestrator | 2025-09-23 20:27:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:44.921131 | orchestrator | 2025-09-23 20:27:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:44.921230 | orchestrator | 2025-09-23 20:27:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:47.965108 | orchestrator | 2025-09-23 20:27:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:47.966919 | orchestrator | 2025-09-23 20:27:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:47.967161 | orchestrator | 2025-09-23 20:27:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:51.015969 | orchestrator | 2025-09-23 20:27:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:51.016989 | orchestrator | 2025-09-23 20:27:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:27:51.017018 | orchestrator | 2025-09-23 20:27:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:54.058387 | orchestrator | 2025-09-23 20:27:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:54.058809 | orchestrator | 2025-09-23 20:27:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:54.058870 | orchestrator | 2025-09-23 20:27:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:27:57.100790 | orchestrator | 2025-09-23 20:27:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:27:57.103204 | orchestrator | 2025-09-23 20:27:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:27:57.103253 | orchestrator | 2025-09-23 20:27:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:00.148381 | orchestrator | 2025-09-23 20:28:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:00.150957 | orchestrator | 2025-09-23 20:28:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:00.151102 | orchestrator | 2025-09-23 20:28:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:03.193684 | orchestrator | 2025-09-23 20:28:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:03.194475 | orchestrator | 2025-09-23 20:28:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:03.194571 | orchestrator | 2025-09-23 20:28:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:06.246328 | orchestrator | 2025-09-23 20:28:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:06.248177 | orchestrator | 2025-09-23 20:28:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:06.248243 | orchestrator | 2025-09-23 20:28:06 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:28:09.283349 | orchestrator | 2025-09-23 20:28:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:09.284430 | orchestrator | 2025-09-23 20:28:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:09.284562 | orchestrator | 2025-09-23 20:28:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:12.333513 | orchestrator | 2025-09-23 20:28:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:12.335978 | orchestrator | 2025-09-23 20:28:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:12.336019 | orchestrator | 2025-09-23 20:28:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:15.380868 | orchestrator | 2025-09-23 20:28:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:15.382267 | orchestrator | 2025-09-23 20:28:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:15.382355 | orchestrator | 2025-09-23 20:28:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:18.425551 | orchestrator | 2025-09-23 20:28:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:18.427713 | orchestrator | 2025-09-23 20:28:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:18.427756 | orchestrator | 2025-09-23 20:28:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:21.475518 | orchestrator | 2025-09-23 20:28:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:21.477435 | orchestrator | 2025-09-23 20:28:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:21.477478 | orchestrator | 2025-09-23 20:28:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:24.520357 | orchestrator | 
2025-09-23 20:28:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:24.521566 | orchestrator | 2025-09-23 20:28:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:24.521598 | orchestrator | 2025-09-23 20:28:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:27.561205 | orchestrator | 2025-09-23 20:28:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:27.562521 | orchestrator | 2025-09-23 20:28:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:27.562555 | orchestrator | 2025-09-23 20:28:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:30.610334 | orchestrator | 2025-09-23 20:28:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:30.611467 | orchestrator | 2025-09-23 20:28:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:30.611498 | orchestrator | 2025-09-23 20:28:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:33.656147 | orchestrator | 2025-09-23 20:28:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:33.658269 | orchestrator | 2025-09-23 20:28:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:33.658492 | orchestrator | 2025-09-23 20:28:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:36.707769 | orchestrator | 2025-09-23 20:28:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:36.709032 | orchestrator | 2025-09-23 20:28:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:36.709334 | orchestrator | 2025-09-23 20:28:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:39.753008 | orchestrator | 2025-09-23 20:28:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:28:39.753920 | orchestrator | 2025-09-23 20:28:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:39.753961 | orchestrator | 2025-09-23 20:28:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:42.796941 | orchestrator | 2025-09-23 20:28:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:42.798278 | orchestrator | 2025-09-23 20:28:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:42.798331 | orchestrator | 2025-09-23 20:28:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:45.837014 | orchestrator | 2025-09-23 20:28:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:45.838924 | orchestrator | 2025-09-23 20:28:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:45.838960 | orchestrator | 2025-09-23 20:28:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:48.882365 | orchestrator | 2025-09-23 20:28:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:48.884387 | orchestrator | 2025-09-23 20:28:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:48.884546 | orchestrator | 2025-09-23 20:28:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:51.933596 | orchestrator | 2025-09-23 20:28:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:51.935614 | orchestrator | 2025-09-23 20:28:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:51.935705 | orchestrator | 2025-09-23 20:28:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:54.983804 | orchestrator | 2025-09-23 20:28:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:54.985080 | orchestrator | 2025-09-23 20:28:54 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:54.985156 | orchestrator | 2025-09-23 20:28:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:28:58.034180 | orchestrator | 2025-09-23 20:28:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:28:58.036341 | orchestrator | 2025-09-23 20:28:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:28:58.036390 | orchestrator | 2025-09-23 20:28:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:01.082831 | orchestrator | 2025-09-23 20:29:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:01.083881 | orchestrator | 2025-09-23 20:29:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:01.083944 | orchestrator | 2025-09-23 20:29:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:04.129239 | orchestrator | 2025-09-23 20:29:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:04.131183 | orchestrator | 2025-09-23 20:29:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:04.131231 | orchestrator | 2025-09-23 20:29:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:07.181342 | orchestrator | 2025-09-23 20:29:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:07.182798 | orchestrator | 2025-09-23 20:29:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:07.183058 | orchestrator | 2025-09-23 20:29:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:10.224990 | orchestrator | 2025-09-23 20:29:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:10.227128 | orchestrator | 2025-09-23 20:29:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:29:10.227250 | orchestrator | 2025-09-23 20:29:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:13.267700 | orchestrator | 2025-09-23 20:29:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:13.269321 | orchestrator | 2025-09-23 20:29:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:13.269869 | orchestrator | 2025-09-23 20:29:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:16.314340 | orchestrator | 2025-09-23 20:29:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:16.316169 | orchestrator | 2025-09-23 20:29:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:16.316454 | orchestrator | 2025-09-23 20:29:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:19.366780 | orchestrator | 2025-09-23 20:29:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:19.368443 | orchestrator | 2025-09-23 20:29:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:19.368521 | orchestrator | 2025-09-23 20:29:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:22.416217 | orchestrator | 2025-09-23 20:29:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:22.417810 | orchestrator | 2025-09-23 20:29:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:22.417831 | orchestrator | 2025-09-23 20:29:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:25.458263 | orchestrator | 2025-09-23 20:29:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:25.460116 | orchestrator | 2025-09-23 20:29:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:25.460147 | orchestrator | 2025-09-23 20:29:25 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:29:28.504150 | orchestrator | 2025-09-23 20:29:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:28.505312 | orchestrator | 2025-09-23 20:29:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:28.505437 | orchestrator | 2025-09-23 20:29:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:31.551519 | orchestrator | 2025-09-23 20:29:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:31.552942 | orchestrator | 2025-09-23 20:29:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:31.553117 | orchestrator | 2025-09-23 20:29:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:34.593544 | orchestrator | 2025-09-23 20:29:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:34.595581 | orchestrator | 2025-09-23 20:29:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:34.595675 | orchestrator | 2025-09-23 20:29:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:37.639125 | orchestrator | 2025-09-23 20:29:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:37.641482 | orchestrator | 2025-09-23 20:29:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:37.641632 | orchestrator | 2025-09-23 20:29:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:40.686562 | orchestrator | 2025-09-23 20:29:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:40.689006 | orchestrator | 2025-09-23 20:29:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:40.689054 | orchestrator | 2025-09-23 20:29:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:43.733970 | orchestrator | 
2025-09-23 20:29:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:43.736762 | orchestrator | 2025-09-23 20:29:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:43.737057 | orchestrator | 2025-09-23 20:29:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:46.786488 | orchestrator | 2025-09-23 20:29:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:46.788145 | orchestrator | 2025-09-23 20:29:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:46.788181 | orchestrator | 2025-09-23 20:29:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:49.830163 | orchestrator | 2025-09-23 20:29:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:49.832579 | orchestrator | 2025-09-23 20:29:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:49.832619 | orchestrator | 2025-09-23 20:29:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:52.878269 | orchestrator | 2025-09-23 20:29:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:52.880908 | orchestrator | 2025-09-23 20:29:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:52.880942 | orchestrator | 2025-09-23 20:29:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:55.924640 | orchestrator | 2025-09-23 20:29:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:29:55.925838 | orchestrator | 2025-09-23 20:29:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:55.925871 | orchestrator | 2025-09-23 20:29:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:29:58.967725 | orchestrator | 2025-09-23 20:29:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:29:58.969452 | orchestrator | 2025-09-23 20:29:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:29:58.969484 | orchestrator | 2025-09-23 20:29:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:02.022099 | orchestrator | 2025-09-23 20:30:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:02.024350 | orchestrator | 2025-09-23 20:30:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:02.024397 | orchestrator | 2025-09-23 20:30:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:05.075595 | orchestrator | 2025-09-23 20:30:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:05.076211 | orchestrator | 2025-09-23 20:30:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:05.076619 | orchestrator | 2025-09-23 20:30:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:08.123087 | orchestrator | 2025-09-23 20:30:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:08.124597 | orchestrator | 2025-09-23 20:30:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:08.124640 | orchestrator | 2025-09-23 20:30:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:11.161918 | orchestrator | 2025-09-23 20:30:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:11.162852 | orchestrator | 2025-09-23 20:30:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:11.162894 | orchestrator | 2025-09-23 20:30:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:14.204440 | orchestrator | 2025-09-23 20:30:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:14.205089 | orchestrator | 2025-09-23 20:30:14 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:14.205121 | orchestrator | 2025-09-23 20:30:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:17.250780 | orchestrator | 2025-09-23 20:30:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:17.253249 | orchestrator | 2025-09-23 20:30:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:17.253282 | orchestrator | 2025-09-23 20:30:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:20.301533 | orchestrator | 2025-09-23 20:30:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:20.302836 | orchestrator | 2025-09-23 20:30:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:20.302882 | orchestrator | 2025-09-23 20:30:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:23.347802 | orchestrator | 2025-09-23 20:30:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:23.348170 | orchestrator | 2025-09-23 20:30:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:23.348208 | orchestrator | 2025-09-23 20:30:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:26.388036 | orchestrator | 2025-09-23 20:30:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:26.389428 | orchestrator | 2025-09-23 20:30:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:30:26.389483 | orchestrator | 2025-09-23 20:30:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:30:29.437223 | orchestrator | 2025-09-23 20:30:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:30:29.438204 | orchestrator | 2025-09-23 20:30:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:30:29.438315 | orchestrator | 2025-09-23 20:30:29 | INFO  | Wait 1 second(s) until the next check
2025-09-23 20:30:32.486204 | orchestrator | 2025-09-23 20:30:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 20:30:32.486738 | orchestrator | 2025-09-23 20:30:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 20:30:32.486783 | orchestrator | 2025-09-23 20:30:32 | INFO  | Wait 1 second(s) until the next check
2025-09-23 20:35:28.042360 | orchestrator | 2025-09-23 20:35:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 20:35:28.046830 | orchestrator | 2025-09-23 20:35:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 20:35:28.046900 | orchestrator | 2025-09-23 20:35:28 | INFO  | Wait 1 second(s) until the next check
2025-09-23 20:35:31.091388 | orchestrator | 2025-09-23 20:35:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 20:35:31.092735 | orchestrator | 2025-09-23 20:35:31 |
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:31.092773 | orchestrator | 2025-09-23 20:35:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:34.141119 | orchestrator | 2025-09-23 20:35:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:34.142815 | orchestrator | 2025-09-23 20:35:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:34.142876 | orchestrator | 2025-09-23 20:35:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:37.191066 | orchestrator | 2025-09-23 20:35:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:37.195078 | orchestrator | 2025-09-23 20:35:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:37.195247 | orchestrator | 2025-09-23 20:35:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:40.239159 | orchestrator | 2025-09-23 20:35:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:40.240157 | orchestrator | 2025-09-23 20:35:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:40.240527 | orchestrator | 2025-09-23 20:35:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:43.281757 | orchestrator | 2025-09-23 20:35:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:43.283350 | orchestrator | 2025-09-23 20:35:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:43.283437 | orchestrator | 2025-09-23 20:35:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:46.328081 | orchestrator | 2025-09-23 20:35:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:46.330366 | orchestrator | 2025-09-23 20:35:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:35:46.330407 | orchestrator | 2025-09-23 20:35:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:49.370579 | orchestrator | 2025-09-23 20:35:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:49.371917 | orchestrator | 2025-09-23 20:35:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:49.371964 | orchestrator | 2025-09-23 20:35:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:52.417052 | orchestrator | 2025-09-23 20:35:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:52.419562 | orchestrator | 2025-09-23 20:35:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:52.419599 | orchestrator | 2025-09-23 20:35:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:55.467751 | orchestrator | 2025-09-23 20:35:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:55.469064 | orchestrator | 2025-09-23 20:35:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:55.469196 | orchestrator | 2025-09-23 20:35:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:35:58.516192 | orchestrator | 2025-09-23 20:35:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:35:58.517399 | orchestrator | 2025-09-23 20:35:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:35:58.517436 | orchestrator | 2025-09-23 20:35:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:01.562231 | orchestrator | 2025-09-23 20:36:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:01.564378 | orchestrator | 2025-09-23 20:36:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:01.564479 | orchestrator | 2025-09-23 20:36:01 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:36:04.610723 | orchestrator | 2025-09-23 20:36:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:04.612802 | orchestrator | 2025-09-23 20:36:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:04.612994 | orchestrator | 2025-09-23 20:36:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:07.660288 | orchestrator | 2025-09-23 20:36:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:07.662776 | orchestrator | 2025-09-23 20:36:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:07.662819 | orchestrator | 2025-09-23 20:36:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:10.708114 | orchestrator | 2025-09-23 20:36:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:10.709728 | orchestrator | 2025-09-23 20:36:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:10.709775 | orchestrator | 2025-09-23 20:36:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:13.755348 | orchestrator | 2025-09-23 20:36:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:13.756312 | orchestrator | 2025-09-23 20:36:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:13.756343 | orchestrator | 2025-09-23 20:36:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:16.798664 | orchestrator | 2025-09-23 20:36:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:16.799975 | orchestrator | 2025-09-23 20:36:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:16.800119 | orchestrator | 2025-09-23 20:36:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:19.840680 | orchestrator | 
2025-09-23 20:36:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:19.841411 | orchestrator | 2025-09-23 20:36:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:19.841717 | orchestrator | 2025-09-23 20:36:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:22.885126 | orchestrator | 2025-09-23 20:36:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:22.886465 | orchestrator | 2025-09-23 20:36:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:22.886507 | orchestrator | 2025-09-23 20:36:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:25.933855 | orchestrator | 2025-09-23 20:36:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:25.934455 | orchestrator | 2025-09-23 20:36:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:25.934502 | orchestrator | 2025-09-23 20:36:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:28.981971 | orchestrator | 2025-09-23 20:36:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:28.985123 | orchestrator | 2025-09-23 20:36:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:28.985206 | orchestrator | 2025-09-23 20:36:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:32.030517 | orchestrator | 2025-09-23 20:36:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:32.031910 | orchestrator | 2025-09-23 20:36:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:32.031946 | orchestrator | 2025-09-23 20:36:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:35.077126 | orchestrator | 2025-09-23 20:36:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:36:35.077842 | orchestrator | 2025-09-23 20:36:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:35.077877 | orchestrator | 2025-09-23 20:36:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:38.126749 | orchestrator | 2025-09-23 20:36:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:38.129390 | orchestrator | 2025-09-23 20:36:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:38.129431 | orchestrator | 2025-09-23 20:36:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:41.170677 | orchestrator | 2025-09-23 20:36:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:41.171779 | orchestrator | 2025-09-23 20:36:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:41.171827 | orchestrator | 2025-09-23 20:36:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:44.216684 | orchestrator | 2025-09-23 20:36:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:44.218563 | orchestrator | 2025-09-23 20:36:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:44.218597 | orchestrator | 2025-09-23 20:36:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:47.260449 | orchestrator | 2025-09-23 20:36:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:47.261261 | orchestrator | 2025-09-23 20:36:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:47.261285 | orchestrator | 2025-09-23 20:36:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:50.303695 | orchestrator | 2025-09-23 20:36:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:50.304961 | orchestrator | 2025-09-23 20:36:50 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:50.304995 | orchestrator | 2025-09-23 20:36:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:53.352292 | orchestrator | 2025-09-23 20:36:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:53.352507 | orchestrator | 2025-09-23 20:36:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:53.352532 | orchestrator | 2025-09-23 20:36:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:56.396353 | orchestrator | 2025-09-23 20:36:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:56.397871 | orchestrator | 2025-09-23 20:36:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:56.397930 | orchestrator | 2025-09-23 20:36:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:36:59.445126 | orchestrator | 2025-09-23 20:36:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:36:59.447534 | orchestrator | 2025-09-23 20:36:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:36:59.447582 | orchestrator | 2025-09-23 20:36:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:02.493242 | orchestrator | 2025-09-23 20:37:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:02.494291 | orchestrator | 2025-09-23 20:37:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:02.494375 | orchestrator | 2025-09-23 20:37:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:05.539205 | orchestrator | 2025-09-23 20:37:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:05.541221 | orchestrator | 2025-09-23 20:37:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:37:05.541260 | orchestrator | 2025-09-23 20:37:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:08.581446 | orchestrator | 2025-09-23 20:37:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:08.582695 | orchestrator | 2025-09-23 20:37:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:08.582839 | orchestrator | 2025-09-23 20:37:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:11.630549 | orchestrator | 2025-09-23 20:37:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:11.631537 | orchestrator | 2025-09-23 20:37:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:11.631653 | orchestrator | 2025-09-23 20:37:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:14.679067 | orchestrator | 2025-09-23 20:37:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:14.682154 | orchestrator | 2025-09-23 20:37:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:14.682249 | orchestrator | 2025-09-23 20:37:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:17.729851 | orchestrator | 2025-09-23 20:37:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:17.731577 | orchestrator | 2025-09-23 20:37:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:17.731787 | orchestrator | 2025-09-23 20:37:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:20.776828 | orchestrator | 2025-09-23 20:37:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:20.777949 | orchestrator | 2025-09-23 20:37:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:20.778161 | orchestrator | 2025-09-23 20:37:20 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:37:23.826262 | orchestrator | 2025-09-23 20:37:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:23.826829 | orchestrator | 2025-09-23 20:37:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:23.826863 | orchestrator | 2025-09-23 20:37:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:26.877067 | orchestrator | 2025-09-23 20:37:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:26.878671 | orchestrator | 2025-09-23 20:37:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:26.878781 | orchestrator | 2025-09-23 20:37:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:29.922899 | orchestrator | 2025-09-23 20:37:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:29.924243 | orchestrator | 2025-09-23 20:37:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:29.924771 | orchestrator | 2025-09-23 20:37:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:32.967436 | orchestrator | 2025-09-23 20:37:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:32.968919 | orchestrator | 2025-09-23 20:37:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:32.968959 | orchestrator | 2025-09-23 20:37:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:36.015223 | orchestrator | 2025-09-23 20:37:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:36.016900 | orchestrator | 2025-09-23 20:37:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:36.016993 | orchestrator | 2025-09-23 20:37:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:39.063418 | orchestrator | 
2025-09-23 20:37:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:39.065186 | orchestrator | 2025-09-23 20:37:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:39.065223 | orchestrator | 2025-09-23 20:37:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:42.109252 | orchestrator | 2025-09-23 20:37:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:42.111217 | orchestrator | 2025-09-23 20:37:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:42.111479 | orchestrator | 2025-09-23 20:37:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:45.158399 | orchestrator | 2025-09-23 20:37:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:45.159874 | orchestrator | 2025-09-23 20:37:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:45.160237 | orchestrator | 2025-09-23 20:37:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:48.198994 | orchestrator | 2025-09-23 20:37:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:48.201158 | orchestrator | 2025-09-23 20:37:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:48.201535 | orchestrator | 2025-09-23 20:37:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:51.250441 | orchestrator | 2025-09-23 20:37:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:51.252843 | orchestrator | 2025-09-23 20:37:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:51.252959 | orchestrator | 2025-09-23 20:37:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:54.293103 | orchestrator | 2025-09-23 20:37:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:37:54.295360 | orchestrator | 2025-09-23 20:37:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:54.295940 | orchestrator | 2025-09-23 20:37:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:37:57.336572 | orchestrator | 2025-09-23 20:37:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:37:57.338857 | orchestrator | 2025-09-23 20:37:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:37:57.338905 | orchestrator | 2025-09-23 20:37:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:00.384657 | orchestrator | 2025-09-23 20:38:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:00.385853 | orchestrator | 2025-09-23 20:38:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:00.385892 | orchestrator | 2025-09-23 20:38:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:03.430226 | orchestrator | 2025-09-23 20:38:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:03.431874 | orchestrator | 2025-09-23 20:38:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:03.432258 | orchestrator | 2025-09-23 20:38:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:06.473194 | orchestrator | 2025-09-23 20:38:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:06.475076 | orchestrator | 2025-09-23 20:38:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:06.475168 | orchestrator | 2025-09-23 20:38:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:09.522289 | orchestrator | 2025-09-23 20:38:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:09.523889 | orchestrator | 2025-09-23 20:38:09 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:09.523950 | orchestrator | 2025-09-23 20:38:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:12.572703 | orchestrator | 2025-09-23 20:38:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:12.574219 | orchestrator | 2025-09-23 20:38:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:12.574253 | orchestrator | 2025-09-23 20:38:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:15.619510 | orchestrator | 2025-09-23 20:38:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:15.621671 | orchestrator | 2025-09-23 20:38:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:15.622231 | orchestrator | 2025-09-23 20:38:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:18.666657 | orchestrator | 2025-09-23 20:38:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:18.668108 | orchestrator | 2025-09-23 20:38:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:18.668137 | orchestrator | 2025-09-23 20:38:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:21.711817 | orchestrator | 2025-09-23 20:38:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:21.713872 | orchestrator | 2025-09-23 20:38:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:21.713905 | orchestrator | 2025-09-23 20:38:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:24.756341 | orchestrator | 2025-09-23 20:38:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:24.759825 | orchestrator | 2025-09-23 20:38:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:38:24.759907 | orchestrator | 2025-09-23 20:38:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:27.810402 | orchestrator | 2025-09-23 20:38:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:27.812300 | orchestrator | 2025-09-23 20:38:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:27.812329 | orchestrator | 2025-09-23 20:38:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:30.857694 | orchestrator | 2025-09-23 20:38:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:30.859890 | orchestrator | 2025-09-23 20:38:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:30.859966 | orchestrator | 2025-09-23 20:38:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:33.901906 | orchestrator | 2025-09-23 20:38:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:33.902879 | orchestrator | 2025-09-23 20:38:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:33.902929 | orchestrator | 2025-09-23 20:38:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:36.951496 | orchestrator | 2025-09-23 20:38:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:36.952883 | orchestrator | 2025-09-23 20:38:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:36.952916 | orchestrator | 2025-09-23 20:38:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:39.998284 | orchestrator | 2025-09-23 20:38:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:39.999506 | orchestrator | 2025-09-23 20:38:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:39.999900 | orchestrator | 2025-09-23 20:38:39 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:38:43.049288 | orchestrator | 2025-09-23 20:38:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:43.051551 | orchestrator | 2025-09-23 20:38:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:43.051614 | orchestrator | 2025-09-23 20:38:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:46.098973 | orchestrator | 2025-09-23 20:38:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:46.101866 | orchestrator | 2025-09-23 20:38:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:46.101914 | orchestrator | 2025-09-23 20:38:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:49.141733 | orchestrator | 2025-09-23 20:38:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:49.144040 | orchestrator | 2025-09-23 20:38:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:49.144421 | orchestrator | 2025-09-23 20:38:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:52.185224 | orchestrator | 2025-09-23 20:38:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:52.185568 | orchestrator | 2025-09-23 20:38:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:52.185863 | orchestrator | 2025-09-23 20:38:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:55.232879 | orchestrator | 2025-09-23 20:38:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:55.235980 | orchestrator | 2025-09-23 20:38:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:55.236033 | orchestrator | 2025-09-23 20:38:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:38:58.286568 | orchestrator | 
2025-09-23 20:38:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:38:58.288611 | orchestrator | 2025-09-23 20:38:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:38:58.288661 | orchestrator | 2025-09-23 20:38:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:39:01.331473 | orchestrator | 2025-09-23 20:39:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:39:01.333216 | orchestrator | 2025-09-23 20:39:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:39:01.333250 | orchestrator | 2025-09-23 20:39:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:39:04.382145 | orchestrator | 2025-09-23 20:39:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:39:04.383082 | orchestrator | 2025-09-23 20:39:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:39:04.383114 | orchestrator | 2025-09-23 20:39:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:39:07.425902 | orchestrator | 2025-09-23 20:39:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:39:07.427436 | orchestrator | 2025-09-23 20:39:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:39:07.427627 | orchestrator | 2025-09-23 20:39:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:39:10.468743 | orchestrator | 2025-09-23 20:39:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:39:10.470239 | orchestrator | 2025-09-23 20:39:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:39:10.470329 | orchestrator | 2025-09-23 20:39:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:39:13.514884 | orchestrator | 2025-09-23 20:39:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:39:13.516924 | orchestrator | 2025-09-23 20:39:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:39:13.516941 | orchestrator | 2025-09-23 20:39:13 | INFO  | Wait 1 second(s) until the next check
[... repeated status checks elided: tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f remained in state STARTED, polled every ~3 seconds from 2025-09-23 20:39:16 to 2025-09-23 20:44:42, each cycle followed by "Wait 1 second(s) until the next check" ...]
2025-09-23 20:44:45.631721 | orchestrator | 2025-09-23 20:44:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:44:45.632862 | orchestrator | 2025-09-23 20:44:45 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:44:45.632899 | orchestrator | 2025-09-23 20:44:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:44:48.680317 | orchestrator | 2025-09-23 20:44:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:44:48.681411 | orchestrator | 2025-09-23 20:44:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:44:48.681535 | orchestrator | 2025-09-23 20:44:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:44:51.723949 | orchestrator | 2025-09-23 20:44:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:44:51.725811 | orchestrator | 2025-09-23 20:44:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:44:51.725986 | orchestrator | 2025-09-23 20:44:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:44:54.776158 | orchestrator | 2025-09-23 20:44:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:44:54.776678 | orchestrator | 2025-09-23 20:44:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:44:54.776940 | orchestrator | 2025-09-23 20:44:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:44:57.824393 | orchestrator | 2025-09-23 20:44:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:44:57.825242 | orchestrator | 2025-09-23 20:44:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:44:57.825294 | orchestrator | 2025-09-23 20:44:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:00.873509 | orchestrator | 2025-09-23 20:45:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:00.875671 | orchestrator | 2025-09-23 20:45:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:45:00.875706 | orchestrator | 2025-09-23 20:45:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:03.922743 | orchestrator | 2025-09-23 20:45:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:03.924596 | orchestrator | 2025-09-23 20:45:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:03.924665 | orchestrator | 2025-09-23 20:45:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:06.967760 | orchestrator | 2025-09-23 20:45:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:06.969204 | orchestrator | 2025-09-23 20:45:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:06.969237 | orchestrator | 2025-09-23 20:45:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:10.014669 | orchestrator | 2025-09-23 20:45:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:10.015783 | orchestrator | 2025-09-23 20:45:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:10.016000 | orchestrator | 2025-09-23 20:45:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:13.055108 | orchestrator | 2025-09-23 20:45:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:13.056587 | orchestrator | 2025-09-23 20:45:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:13.056712 | orchestrator | 2025-09-23 20:45:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:16.100953 | orchestrator | 2025-09-23 20:45:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:16.102587 | orchestrator | 2025-09-23 20:45:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:16.102640 | orchestrator | 2025-09-23 20:45:16 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:45:19.146538 | orchestrator | 2025-09-23 20:45:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:19.148513 | orchestrator | 2025-09-23 20:45:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:19.148539 | orchestrator | 2025-09-23 20:45:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:22.192005 | orchestrator | 2025-09-23 20:45:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:22.194594 | orchestrator | 2025-09-23 20:45:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:22.194645 | orchestrator | 2025-09-23 20:45:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:25.238679 | orchestrator | 2025-09-23 20:45:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:25.242895 | orchestrator | 2025-09-23 20:45:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:25.242948 | orchestrator | 2025-09-23 20:45:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:28.284381 | orchestrator | 2025-09-23 20:45:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:28.286336 | orchestrator | 2025-09-23 20:45:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:28.286371 | orchestrator | 2025-09-23 20:45:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:31.333858 | orchestrator | 2025-09-23 20:45:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:31.336066 | orchestrator | 2025-09-23 20:45:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:31.336201 | orchestrator | 2025-09-23 20:45:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:34.384431 | orchestrator | 
2025-09-23 20:45:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:34.386441 | orchestrator | 2025-09-23 20:45:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:34.386516 | orchestrator | 2025-09-23 20:45:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:37.433282 | orchestrator | 2025-09-23 20:45:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:37.434861 | orchestrator | 2025-09-23 20:45:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:37.434944 | orchestrator | 2025-09-23 20:45:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:40.480800 | orchestrator | 2025-09-23 20:45:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:40.482384 | orchestrator | 2025-09-23 20:45:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:40.482513 | orchestrator | 2025-09-23 20:45:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:43.531132 | orchestrator | 2025-09-23 20:45:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:43.533426 | orchestrator | 2025-09-23 20:45:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:43.533545 | orchestrator | 2025-09-23 20:45:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:46.576387 | orchestrator | 2025-09-23 20:45:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:46.577395 | orchestrator | 2025-09-23 20:45:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:46.578318 | orchestrator | 2025-09-23 20:45:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:49.622939 | orchestrator | 2025-09-23 20:45:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:45:49.623194 | orchestrator | 2025-09-23 20:45:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:49.623900 | orchestrator | 2025-09-23 20:45:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:52.673243 | orchestrator | 2025-09-23 20:45:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:52.674272 | orchestrator | 2025-09-23 20:45:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:52.674318 | orchestrator | 2025-09-23 20:45:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:55.716345 | orchestrator | 2025-09-23 20:45:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:55.717990 | orchestrator | 2025-09-23 20:45:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:55.718088 | orchestrator | 2025-09-23 20:45:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:45:58.764098 | orchestrator | 2025-09-23 20:45:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:45:58.764871 | orchestrator | 2025-09-23 20:45:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:45:58.764900 | orchestrator | 2025-09-23 20:45:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:01.809686 | orchestrator | 2025-09-23 20:46:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:01.811083 | orchestrator | 2025-09-23 20:46:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:01.811137 | orchestrator | 2025-09-23 20:46:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:04.859824 | orchestrator | 2025-09-23 20:46:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:04.861240 | orchestrator | 2025-09-23 20:46:04 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:04.861418 | orchestrator | 2025-09-23 20:46:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:07.906751 | orchestrator | 2025-09-23 20:46:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:07.908690 | orchestrator | 2025-09-23 20:46:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:07.908804 | orchestrator | 2025-09-23 20:46:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:10.950366 | orchestrator | 2025-09-23 20:46:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:10.952373 | orchestrator | 2025-09-23 20:46:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:10.952405 | orchestrator | 2025-09-23 20:46:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:13.995401 | orchestrator | 2025-09-23 20:46:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:13.997227 | orchestrator | 2025-09-23 20:46:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:13.997311 | orchestrator | 2025-09-23 20:46:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:17.043049 | orchestrator | 2025-09-23 20:46:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:17.044504 | orchestrator | 2025-09-23 20:46:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:17.044534 | orchestrator | 2025-09-23 20:46:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:20.093642 | orchestrator | 2025-09-23 20:46:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:20.096515 | orchestrator | 2025-09-23 20:46:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:46:20.096629 | orchestrator | 2025-09-23 20:46:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:23.148869 | orchestrator | 2025-09-23 20:46:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:23.151551 | orchestrator | 2025-09-23 20:46:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:23.151684 | orchestrator | 2025-09-23 20:46:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:26.188280 | orchestrator | 2025-09-23 20:46:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:26.189787 | orchestrator | 2025-09-23 20:46:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:26.189816 | orchestrator | 2025-09-23 20:46:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:29.237479 | orchestrator | 2025-09-23 20:46:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:29.238677 | orchestrator | 2025-09-23 20:46:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:29.238712 | orchestrator | 2025-09-23 20:46:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:32.279941 | orchestrator | 2025-09-23 20:46:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:32.281873 | orchestrator | 2025-09-23 20:46:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:32.281902 | orchestrator | 2025-09-23 20:46:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:35.325939 | orchestrator | 2025-09-23 20:46:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:35.328204 | orchestrator | 2025-09-23 20:46:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:35.328595 | orchestrator | 2025-09-23 20:46:35 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:46:38.377055 | orchestrator | 2025-09-23 20:46:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:38.378454 | orchestrator | 2025-09-23 20:46:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:38.378485 | orchestrator | 2025-09-23 20:46:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:41.420144 | orchestrator | 2025-09-23 20:46:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:41.421788 | orchestrator | 2025-09-23 20:46:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:41.421817 | orchestrator | 2025-09-23 20:46:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:44.471781 | orchestrator | 2025-09-23 20:46:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:44.474418 | orchestrator | 2025-09-23 20:46:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:44.474449 | orchestrator | 2025-09-23 20:46:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:47.519511 | orchestrator | 2025-09-23 20:46:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:47.520497 | orchestrator | 2025-09-23 20:46:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:47.520529 | orchestrator | 2025-09-23 20:46:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:50.561794 | orchestrator | 2025-09-23 20:46:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:50.563786 | orchestrator | 2025-09-23 20:46:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:50.563832 | orchestrator | 2025-09-23 20:46:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:53.606321 | orchestrator | 
2025-09-23 20:46:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:53.607694 | orchestrator | 2025-09-23 20:46:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:53.607765 | orchestrator | 2025-09-23 20:46:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:56.644822 | orchestrator | 2025-09-23 20:46:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:56.646417 | orchestrator | 2025-09-23 20:46:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:56.646466 | orchestrator | 2025-09-23 20:46:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:46:59.696446 | orchestrator | 2025-09-23 20:46:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:46:59.697838 | orchestrator | 2025-09-23 20:46:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:46:59.697866 | orchestrator | 2025-09-23 20:46:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:02.749435 | orchestrator | 2025-09-23 20:47:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:02.750859 | orchestrator | 2025-09-23 20:47:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:02.750907 | orchestrator | 2025-09-23 20:47:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:05.796107 | orchestrator | 2025-09-23 20:47:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:05.798270 | orchestrator | 2025-09-23 20:47:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:05.798414 | orchestrator | 2025-09-23 20:47:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:08.845304 | orchestrator | 2025-09-23 20:47:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:47:08.846527 | orchestrator | 2025-09-23 20:47:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:08.846733 | orchestrator | 2025-09-23 20:47:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:11.893849 | orchestrator | 2025-09-23 20:47:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:11.895714 | orchestrator | 2025-09-23 20:47:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:11.895918 | orchestrator | 2025-09-23 20:47:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:14.941140 | orchestrator | 2025-09-23 20:47:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:14.942892 | orchestrator | 2025-09-23 20:47:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:14.942925 | orchestrator | 2025-09-23 20:47:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:17.988930 | orchestrator | 2025-09-23 20:47:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:17.990497 | orchestrator | 2025-09-23 20:47:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:17.990533 | orchestrator | 2025-09-23 20:47:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:21.033390 | orchestrator | 2025-09-23 20:47:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:21.034408 | orchestrator | 2025-09-23 20:47:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:21.034438 | orchestrator | 2025-09-23 20:47:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:24.078876 | orchestrator | 2025-09-23 20:47:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:24.080800 | orchestrator | 2025-09-23 20:47:24 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:24.080829 | orchestrator | 2025-09-23 20:47:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:27.125172 | orchestrator | 2025-09-23 20:47:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:27.126672 | orchestrator | 2025-09-23 20:47:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:27.126717 | orchestrator | 2025-09-23 20:47:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:30.173697 | orchestrator | 2025-09-23 20:47:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:30.174952 | orchestrator | 2025-09-23 20:47:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:30.175175 | orchestrator | 2025-09-23 20:47:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:33.216363 | orchestrator | 2025-09-23 20:47:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:33.218293 | orchestrator | 2025-09-23 20:47:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:33.218328 | orchestrator | 2025-09-23 20:47:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:36.258635 | orchestrator | 2025-09-23 20:47:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:36.259152 | orchestrator | 2025-09-23 20:47:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:36.259179 | orchestrator | 2025-09-23 20:47:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:39.300184 | orchestrator | 2025-09-23 20:47:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:39.301525 | orchestrator | 2025-09-23 20:47:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:47:39.301556 | orchestrator | 2025-09-23 20:47:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:42.350879 | orchestrator | 2025-09-23 20:47:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:42.353189 | orchestrator | 2025-09-23 20:47:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:42.353327 | orchestrator | 2025-09-23 20:47:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:45.395783 | orchestrator | 2025-09-23 20:47:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:45.397708 | orchestrator | 2025-09-23 20:47:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:45.397730 | orchestrator | 2025-09-23 20:47:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:48.441194 | orchestrator | 2025-09-23 20:47:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:48.442817 | orchestrator | 2025-09-23 20:47:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:48.442849 | orchestrator | 2025-09-23 20:47:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:51.488587 | orchestrator | 2025-09-23 20:47:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:51.491114 | orchestrator | 2025-09-23 20:47:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:51.491960 | orchestrator | 2025-09-23 20:47:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:47:54.532673 | orchestrator | 2025-09-23 20:47:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:54.533007 | orchestrator | 2025-09-23 20:47:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:54.533037 | orchestrator | 2025-09-23 20:47:54 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:47:57.576169 | orchestrator | 2025-09-23 20:47:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:47:57.577360 | orchestrator | 2025-09-23 20:47:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:47:57.577399 | orchestrator | 2025-09-23 20:47:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:00.621446 | orchestrator | 2025-09-23 20:48:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:00.623374 | orchestrator | 2025-09-23 20:48:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:00.623431 | orchestrator | 2025-09-23 20:48:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:03.672748 | orchestrator | 2025-09-23 20:48:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:03.675312 | orchestrator | 2025-09-23 20:48:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:03.675346 | orchestrator | 2025-09-23 20:48:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:06.720276 | orchestrator | 2025-09-23 20:48:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:06.722714 | orchestrator | 2025-09-23 20:48:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:06.722800 | orchestrator | 2025-09-23 20:48:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:09.768159 | orchestrator | 2025-09-23 20:48:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:09.770215 | orchestrator | 2025-09-23 20:48:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:09.770355 | orchestrator | 2025-09-23 20:48:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:12.818858 | orchestrator | 
2025-09-23 20:48:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:12.820327 | orchestrator | 2025-09-23 20:48:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:12.820356 | orchestrator | 2025-09-23 20:48:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:15.865330 | orchestrator | 2025-09-23 20:48:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:15.866980 | orchestrator | 2025-09-23 20:48:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:15.867028 | orchestrator | 2025-09-23 20:48:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:18.915994 | orchestrator | 2025-09-23 20:48:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:18.918607 | orchestrator | 2025-09-23 20:48:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:18.918642 | orchestrator | 2025-09-23 20:48:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:21.962305 | orchestrator | 2025-09-23 20:48:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:21.963581 | orchestrator | 2025-09-23 20:48:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:21.963656 | orchestrator | 2025-09-23 20:48:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:24.999822 | orchestrator | 2025-09-23 20:48:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:48:25.001786 | orchestrator | 2025-09-23 20:48:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:48:25.001828 | orchestrator | 2025-09-23 20:48:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:48:28.051078 | orchestrator | 2025-09-23 20:48:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:48:28.053553 | orchestrator | 2025-09-23 20:48:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 20:48:28.053618 | orchestrator | 2025-09-23 20:48:28 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds; tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f remained in state STARTED from 20:48:31 through 20:53:41 ...]
2025-09-23 20:53:44.937541 | orchestrator | 2025-09-23 20:53:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in
state STARTED 2025-09-23 20:53:44.937804 | orchestrator | 2025-09-23 20:53:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:53:44.938147 | orchestrator | 2025-09-23 20:53:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:53:47.983671 | orchestrator | 2025-09-23 20:53:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:53:47.985652 | orchestrator | 2025-09-23 20:53:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:53:47.985711 | orchestrator | 2025-09-23 20:53:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:53:51.026949 | orchestrator | 2025-09-23 20:53:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:53:51.027900 | orchestrator | 2025-09-23 20:53:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:53:51.027934 | orchestrator | 2025-09-23 20:53:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:53:54.068731 | orchestrator | 2025-09-23 20:53:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:53:54.070144 | orchestrator | 2025-09-23 20:53:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:53:54.070194 | orchestrator | 2025-09-23 20:53:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:53:57.106680 | orchestrator | 2025-09-23 20:53:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:53:57.108878 | orchestrator | 2025-09-23 20:53:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:53:57.108902 | orchestrator | 2025-09-23 20:53:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:00.154900 | orchestrator | 2025-09-23 20:54:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:00.156727 | orchestrator | 2025-09-23 20:54:00 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:00.156762 | orchestrator | 2025-09-23 20:54:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:03.198140 | orchestrator | 2025-09-23 20:54:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:03.199662 | orchestrator | 2025-09-23 20:54:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:03.199746 | orchestrator | 2025-09-23 20:54:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:06.237714 | orchestrator | 2025-09-23 20:54:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:06.239936 | orchestrator | 2025-09-23 20:54:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:06.239971 | orchestrator | 2025-09-23 20:54:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:09.284786 | orchestrator | 2025-09-23 20:54:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:09.286364 | orchestrator | 2025-09-23 20:54:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:09.287731 | orchestrator | 2025-09-23 20:54:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:12.331060 | orchestrator | 2025-09-23 20:54:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:12.332409 | orchestrator | 2025-09-23 20:54:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:12.332460 | orchestrator | 2025-09-23 20:54:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:15.386294 | orchestrator | 2025-09-23 20:54:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:15.387762 | orchestrator | 2025-09-23 20:54:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:54:15.387830 | orchestrator | 2025-09-23 20:54:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:18.430884 | orchestrator | 2025-09-23 20:54:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:18.431802 | orchestrator | 2025-09-23 20:54:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:18.431835 | orchestrator | 2025-09-23 20:54:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:21.478445 | orchestrator | 2025-09-23 20:54:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:21.480272 | orchestrator | 2025-09-23 20:54:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:21.480307 | orchestrator | 2025-09-23 20:54:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:24.519295 | orchestrator | 2025-09-23 20:54:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:24.520885 | orchestrator | 2025-09-23 20:54:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:24.520920 | orchestrator | 2025-09-23 20:54:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:27.567340 | orchestrator | 2025-09-23 20:54:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:27.568711 | orchestrator | 2025-09-23 20:54:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:27.568756 | orchestrator | 2025-09-23 20:54:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:30.620449 | orchestrator | 2025-09-23 20:54:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:30.621813 | orchestrator | 2025-09-23 20:54:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:30.621848 | orchestrator | 2025-09-23 20:54:30 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:54:33.668878 | orchestrator | 2025-09-23 20:54:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:33.670449 | orchestrator | 2025-09-23 20:54:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:33.670490 | orchestrator | 2025-09-23 20:54:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:36.713054 | orchestrator | 2025-09-23 20:54:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:36.715159 | orchestrator | 2025-09-23 20:54:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:36.715285 | orchestrator | 2025-09-23 20:54:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:39.759016 | orchestrator | 2025-09-23 20:54:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:39.759644 | orchestrator | 2025-09-23 20:54:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:39.759678 | orchestrator | 2025-09-23 20:54:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:42.803841 | orchestrator | 2025-09-23 20:54:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:42.805845 | orchestrator | 2025-09-23 20:54:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:42.805879 | orchestrator | 2025-09-23 20:54:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:45.847426 | orchestrator | 2025-09-23 20:54:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:45.848327 | orchestrator | 2025-09-23 20:54:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:45.848399 | orchestrator | 2025-09-23 20:54:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:48.897125 | orchestrator | 
2025-09-23 20:54:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:48.898840 | orchestrator | 2025-09-23 20:54:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:48.898874 | orchestrator | 2025-09-23 20:54:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:51.948615 | orchestrator | 2025-09-23 20:54:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:51.949919 | orchestrator | 2025-09-23 20:54:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:51.949955 | orchestrator | 2025-09-23 20:54:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:54.995781 | orchestrator | 2025-09-23 20:54:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:54.998938 | orchestrator | 2025-09-23 20:54:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:54.999041 | orchestrator | 2025-09-23 20:54:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:54:58.056354 | orchestrator | 2025-09-23 20:54:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:54:58.058000 | orchestrator | 2025-09-23 20:54:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:54:58.058087 | orchestrator | 2025-09-23 20:54:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:01.111327 | orchestrator | 2025-09-23 20:55:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:01.112621 | orchestrator | 2025-09-23 20:55:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:01.112801 | orchestrator | 2025-09-23 20:55:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:04.163973 | orchestrator | 2025-09-23 20:55:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:55:04.165911 | orchestrator | 2025-09-23 20:55:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:04.165963 | orchestrator | 2025-09-23 20:55:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:07.216121 | orchestrator | 2025-09-23 20:55:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:07.217426 | orchestrator | 2025-09-23 20:55:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:07.217679 | orchestrator | 2025-09-23 20:55:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:10.257287 | orchestrator | 2025-09-23 20:55:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:10.259332 | orchestrator | 2025-09-23 20:55:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:10.259365 | orchestrator | 2025-09-23 20:55:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:13.305433 | orchestrator | 2025-09-23 20:55:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:13.307734 | orchestrator | 2025-09-23 20:55:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:13.307775 | orchestrator | 2025-09-23 20:55:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:16.350940 | orchestrator | 2025-09-23 20:55:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:16.352376 | orchestrator | 2025-09-23 20:55:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:16.352416 | orchestrator | 2025-09-23 20:55:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:19.398057 | orchestrator | 2025-09-23 20:55:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:19.399203 | orchestrator | 2025-09-23 20:55:19 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:19.399234 | orchestrator | 2025-09-23 20:55:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:22.443375 | orchestrator | 2025-09-23 20:55:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:22.445267 | orchestrator | 2025-09-23 20:55:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:22.445392 | orchestrator | 2025-09-23 20:55:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:25.486301 | orchestrator | 2025-09-23 20:55:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:25.487759 | orchestrator | 2025-09-23 20:55:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:25.487964 | orchestrator | 2025-09-23 20:55:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:28.546923 | orchestrator | 2025-09-23 20:55:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:28.549175 | orchestrator | 2025-09-23 20:55:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:28.549209 | orchestrator | 2025-09-23 20:55:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:31.622651 | orchestrator | 2025-09-23 20:55:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:31.624333 | orchestrator | 2025-09-23 20:55:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:31.624366 | orchestrator | 2025-09-23 20:55:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:34.682070 | orchestrator | 2025-09-23 20:55:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:34.684267 | orchestrator | 2025-09-23 20:55:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:55:34.684310 | orchestrator | 2025-09-23 20:55:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:37.734089 | orchestrator | 2025-09-23 20:55:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:37.736548 | orchestrator | 2025-09-23 20:55:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:37.736811 | orchestrator | 2025-09-23 20:55:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:40.776202 | orchestrator | 2025-09-23 20:55:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:40.777806 | orchestrator | 2025-09-23 20:55:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:40.777831 | orchestrator | 2025-09-23 20:55:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:43.824344 | orchestrator | 2025-09-23 20:55:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:43.825973 | orchestrator | 2025-09-23 20:55:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:43.826203 | orchestrator | 2025-09-23 20:55:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:46.871063 | orchestrator | 2025-09-23 20:55:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:46.874888 | orchestrator | 2025-09-23 20:55:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:46.875119 | orchestrator | 2025-09-23 20:55:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:49.921536 | orchestrator | 2025-09-23 20:55:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:49.922456 | orchestrator | 2025-09-23 20:55:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:49.922586 | orchestrator | 2025-09-23 20:55:49 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:55:52.967641 | orchestrator | 2025-09-23 20:55:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:52.969029 | orchestrator | 2025-09-23 20:55:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:52.969172 | orchestrator | 2025-09-23 20:55:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:56.016459 | orchestrator | 2025-09-23 20:55:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:56.018472 | orchestrator | 2025-09-23 20:55:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:56.018866 | orchestrator | 2025-09-23 20:55:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:55:59.071357 | orchestrator | 2025-09-23 20:55:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:55:59.072889 | orchestrator | 2025-09-23 20:55:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:55:59.073086 | orchestrator | 2025-09-23 20:55:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:02.113187 | orchestrator | 2025-09-23 20:56:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:02.114733 | orchestrator | 2025-09-23 20:56:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:02.114842 | orchestrator | 2025-09-23 20:56:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:05.169455 | orchestrator | 2025-09-23 20:56:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:05.171109 | orchestrator | 2025-09-23 20:56:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:05.171204 | orchestrator | 2025-09-23 20:56:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:08.216927 | orchestrator | 
2025-09-23 20:56:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:08.218915 | orchestrator | 2025-09-23 20:56:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:08.218950 | orchestrator | 2025-09-23 20:56:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:11.263103 | orchestrator | 2025-09-23 20:56:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:11.265500 | orchestrator | 2025-09-23 20:56:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:11.265540 | orchestrator | 2025-09-23 20:56:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:14.314425 | orchestrator | 2025-09-23 20:56:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:14.316560 | orchestrator | 2025-09-23 20:56:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:14.317720 | orchestrator | 2025-09-23 20:56:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:17.362186 | orchestrator | 2025-09-23 20:56:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:17.363976 | orchestrator | 2025-09-23 20:56:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:17.364031 | orchestrator | 2025-09-23 20:56:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:20.412799 | orchestrator | 2025-09-23 20:56:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:20.414870 | orchestrator | 2025-09-23 20:56:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:20.415049 | orchestrator | 2025-09-23 20:56:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:23.455428 | orchestrator | 2025-09-23 20:56:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 20:56:23.457324 | orchestrator | 2025-09-23 20:56:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:23.457359 | orchestrator | 2025-09-23 20:56:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:26.499425 | orchestrator | 2025-09-23 20:56:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:26.500594 | orchestrator | 2025-09-23 20:56:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:26.500669 | orchestrator | 2025-09-23 20:56:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:29.543620 | orchestrator | 2025-09-23 20:56:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:29.545619 | orchestrator | 2025-09-23 20:56:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:29.545657 | orchestrator | 2025-09-23 20:56:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:32.587611 | orchestrator | 2025-09-23 20:56:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:32.589341 | orchestrator | 2025-09-23 20:56:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:32.589378 | orchestrator | 2025-09-23 20:56:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:35.638197 | orchestrator | 2025-09-23 20:56:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:35.639970 | orchestrator | 2025-09-23 20:56:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:35.639989 | orchestrator | 2025-09-23 20:56:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:38.683650 | orchestrator | 2025-09-23 20:56:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:38.684494 | orchestrator | 2025-09-23 20:56:38 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:38.684531 | orchestrator | 2025-09-23 20:56:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:41.733016 | orchestrator | 2025-09-23 20:56:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:41.735089 | orchestrator | 2025-09-23 20:56:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:41.735125 | orchestrator | 2025-09-23 20:56:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:44.780464 | orchestrator | 2025-09-23 20:56:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:44.783547 | orchestrator | 2025-09-23 20:56:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:44.783601 | orchestrator | 2025-09-23 20:56:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:47.829231 | orchestrator | 2025-09-23 20:56:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:47.830786 | orchestrator | 2025-09-23 20:56:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:47.830849 | orchestrator | 2025-09-23 20:56:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:50.876970 | orchestrator | 2025-09-23 20:56:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:50.878287 | orchestrator | 2025-09-23 20:56:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:50.878372 | orchestrator | 2025-09-23 20:56:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:53.915907 | orchestrator | 2025-09-23 20:56:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:53.918243 | orchestrator | 2025-09-23 20:56:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 20:56:53.918296 | orchestrator | 2025-09-23 20:56:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:56:56.969844 | orchestrator | 2025-09-23 20:56:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:56:56.971655 | orchestrator | 2025-09-23 20:56:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:56:56.971781 | orchestrator | 2025-09-23 20:56:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:57:00.026269 | orchestrator | 2025-09-23 20:57:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:57:00.027573 | orchestrator | 2025-09-23 20:57:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:57:00.027614 | orchestrator | 2025-09-23 20:57:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:57:03.072919 | orchestrator | 2025-09-23 20:57:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:57:03.074254 | orchestrator | 2025-09-23 20:57:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:57:03.074312 | orchestrator | 2025-09-23 20:57:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:57:06.125504 | orchestrator | 2025-09-23 20:57:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:57:06.126129 | orchestrator | 2025-09-23 20:57:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:57:06.126171 | orchestrator | 2025-09-23 20:57:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 20:57:09.171676 | orchestrator | 2025-09-23 20:57:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 20:57:09.172944 | orchestrator | 2025-09-23 20:57:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 20:57:09.172993 | orchestrator | 2025-09-23 20:57:09 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 20:57:12.221584 | orchestrator | 2025-09-23 20:57:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 20:57:12.223418 | orchestrator | 2025-09-23 20:57:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 20:57:12.223895 | orchestrator | 2025-09-23 20:57:12 | INFO  | Wait 1 second(s) until the next check
[... identical polling output elided: both tasks remained in state STARTED, checked every ~3 seconds from 20:57:15 through 21:02:23 ...]
2025-09-23 21:02:26.110244 | orchestrator | 2025-09-23 21:02:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 21:02:26.111690 | orchestrator | 2025-09-23 21:02:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 21:02:26.111724 | orchestrator | 2025-09-23 21:02:26 | INFO  | Wait
1 second(s) until the next check 2025-09-23 21:02:29.154227 | orchestrator | 2025-09-23 21:02:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:29.158113 | orchestrator | 2025-09-23 21:02:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:29.158200 | orchestrator | 2025-09-23 21:02:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:32.205666 | orchestrator | 2025-09-23 21:02:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:32.206719 | orchestrator | 2025-09-23 21:02:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:32.206746 | orchestrator | 2025-09-23 21:02:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:35.252219 | orchestrator | 2025-09-23 21:02:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:35.254530 | orchestrator | 2025-09-23 21:02:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:35.254818 | orchestrator | 2025-09-23 21:02:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:38.294197 | orchestrator | 2025-09-23 21:02:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:38.295613 | orchestrator | 2025-09-23 21:02:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:38.295661 | orchestrator | 2025-09-23 21:02:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:41.338307 | orchestrator | 2025-09-23 21:02:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:41.340284 | orchestrator | 2025-09-23 21:02:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:41.340365 | orchestrator | 2025-09-23 21:02:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:44.388369 | orchestrator | 
2025-09-23 21:02:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:44.391343 | orchestrator | 2025-09-23 21:02:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:44.391379 | orchestrator | 2025-09-23 21:02:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:47.432713 | orchestrator | 2025-09-23 21:02:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:47.434116 | orchestrator | 2025-09-23 21:02:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:47.434300 | orchestrator | 2025-09-23 21:02:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:50.479803 | orchestrator | 2025-09-23 21:02:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:50.481889 | orchestrator | 2025-09-23 21:02:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:50.482175 | orchestrator | 2025-09-23 21:02:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:53.522707 | orchestrator | 2025-09-23 21:02:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:53.524443 | orchestrator | 2025-09-23 21:02:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:53.524684 | orchestrator | 2025-09-23 21:02:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:56.563514 | orchestrator | 2025-09-23 21:02:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:02:56.564560 | orchestrator | 2025-09-23 21:02:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:56.564599 | orchestrator | 2025-09-23 21:02:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:02:59.609925 | orchestrator | 2025-09-23 21:02:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:02:59.611226 | orchestrator | 2025-09-23 21:02:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:02:59.611435 | orchestrator | 2025-09-23 21:02:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:02.657109 | orchestrator | 2025-09-23 21:03:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:02.659478 | orchestrator | 2025-09-23 21:03:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:02.659521 | orchestrator | 2025-09-23 21:03:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:05.710722 | orchestrator | 2025-09-23 21:03:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:05.713491 | orchestrator | 2025-09-23 21:03:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:05.713576 | orchestrator | 2025-09-23 21:03:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:08.754738 | orchestrator | 2025-09-23 21:03:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:08.755900 | orchestrator | 2025-09-23 21:03:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:08.756004 | orchestrator | 2025-09-23 21:03:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:11.796222 | orchestrator | 2025-09-23 21:03:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:11.797724 | orchestrator | 2025-09-23 21:03:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:11.797757 | orchestrator | 2025-09-23 21:03:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:14.841803 | orchestrator | 2025-09-23 21:03:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:14.843506 | orchestrator | 2025-09-23 21:03:14 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:14.843590 | orchestrator | 2025-09-23 21:03:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:17.888453 | orchestrator | 2025-09-23 21:03:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:17.890741 | orchestrator | 2025-09-23 21:03:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:17.891025 | orchestrator | 2025-09-23 21:03:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:20.933337 | orchestrator | 2025-09-23 21:03:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:20.933432 | orchestrator | 2025-09-23 21:03:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:20.933447 | orchestrator | 2025-09-23 21:03:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:23.984225 | orchestrator | 2025-09-23 21:03:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:23.986719 | orchestrator | 2025-09-23 21:03:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:23.986752 | orchestrator | 2025-09-23 21:03:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:27.035735 | orchestrator | 2025-09-23 21:03:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:27.036913 | orchestrator | 2025-09-23 21:03:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:27.036947 | orchestrator | 2025-09-23 21:03:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:30.085877 | orchestrator | 2025-09-23 21:03:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:30.087933 | orchestrator | 2025-09-23 21:03:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:03:30.087969 | orchestrator | 2025-09-23 21:03:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:33.135382 | orchestrator | 2025-09-23 21:03:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:33.136576 | orchestrator | 2025-09-23 21:03:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:33.136694 | orchestrator | 2025-09-23 21:03:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:36.183182 | orchestrator | 2025-09-23 21:03:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:36.185543 | orchestrator | 2025-09-23 21:03:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:36.185674 | orchestrator | 2025-09-23 21:03:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:39.231521 | orchestrator | 2025-09-23 21:03:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:39.232117 | orchestrator | 2025-09-23 21:03:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:39.232397 | orchestrator | 2025-09-23 21:03:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:42.274286 | orchestrator | 2025-09-23 21:03:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:42.276195 | orchestrator | 2025-09-23 21:03:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:42.276265 | orchestrator | 2025-09-23 21:03:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:45.326810 | orchestrator | 2025-09-23 21:03:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:45.328685 | orchestrator | 2025-09-23 21:03:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:45.328793 | orchestrator | 2025-09-23 21:03:45 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:03:48.372219 | orchestrator | 2025-09-23 21:03:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:48.373377 | orchestrator | 2025-09-23 21:03:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:48.373459 | orchestrator | 2025-09-23 21:03:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:51.415364 | orchestrator | 2025-09-23 21:03:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:51.417489 | orchestrator | 2025-09-23 21:03:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:51.417537 | orchestrator | 2025-09-23 21:03:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:54.463870 | orchestrator | 2025-09-23 21:03:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:54.465125 | orchestrator | 2025-09-23 21:03:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:54.465159 | orchestrator | 2025-09-23 21:03:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:03:57.505937 | orchestrator | 2025-09-23 21:03:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:03:57.506907 | orchestrator | 2025-09-23 21:03:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:03:57.506993 | orchestrator | 2025-09-23 21:03:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:00.553960 | orchestrator | 2025-09-23 21:04:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:00.554851 | orchestrator | 2025-09-23 21:04:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:00.554888 | orchestrator | 2025-09-23 21:04:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:03.596694 | orchestrator | 
2025-09-23 21:04:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:03.597638 | orchestrator | 2025-09-23 21:04:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:03.597670 | orchestrator | 2025-09-23 21:04:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:06.644995 | orchestrator | 2025-09-23 21:04:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:06.647008 | orchestrator | 2025-09-23 21:04:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:06.647147 | orchestrator | 2025-09-23 21:04:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:09.697970 | orchestrator | 2025-09-23 21:04:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:09.699800 | orchestrator | 2025-09-23 21:04:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:09.699889 | orchestrator | 2025-09-23 21:04:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:12.746732 | orchestrator | 2025-09-23 21:04:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:12.748759 | orchestrator | 2025-09-23 21:04:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:12.749012 | orchestrator | 2025-09-23 21:04:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:15.798660 | orchestrator | 2025-09-23 21:04:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:15.800670 | orchestrator | 2025-09-23 21:04:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:15.800735 | orchestrator | 2025-09-23 21:04:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:18.849880 | orchestrator | 2025-09-23 21:04:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:04:18.851561 | orchestrator | 2025-09-23 21:04:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:18.851603 | orchestrator | 2025-09-23 21:04:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:21.898352 | orchestrator | 2025-09-23 21:04:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:21.900224 | orchestrator | 2025-09-23 21:04:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:21.900516 | orchestrator | 2025-09-23 21:04:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:24.949347 | orchestrator | 2025-09-23 21:04:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:24.950750 | orchestrator | 2025-09-23 21:04:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:24.950935 | orchestrator | 2025-09-23 21:04:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:27.997931 | orchestrator | 2025-09-23 21:04:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:28.000042 | orchestrator | 2025-09-23 21:04:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:28.000087 | orchestrator | 2025-09-23 21:04:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:31.047026 | orchestrator | 2025-09-23 21:04:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:31.048707 | orchestrator | 2025-09-23 21:04:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:31.048745 | orchestrator | 2025-09-23 21:04:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:34.098161 | orchestrator | 2025-09-23 21:04:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:34.099279 | orchestrator | 2025-09-23 21:04:34 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:34.099314 | orchestrator | 2025-09-23 21:04:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:37.143969 | orchestrator | 2025-09-23 21:04:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:37.147216 | orchestrator | 2025-09-23 21:04:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:37.147257 | orchestrator | 2025-09-23 21:04:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:40.193348 | orchestrator | 2025-09-23 21:04:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:40.195252 | orchestrator | 2025-09-23 21:04:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:40.195312 | orchestrator | 2025-09-23 21:04:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:43.244918 | orchestrator | 2025-09-23 21:04:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:43.246431 | orchestrator | 2025-09-23 21:04:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:43.246481 | orchestrator | 2025-09-23 21:04:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:46.291698 | orchestrator | 2025-09-23 21:04:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:46.293601 | orchestrator | 2025-09-23 21:04:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:46.293685 | orchestrator | 2025-09-23 21:04:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:49.337484 | orchestrator | 2025-09-23 21:04:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:49.338956 | orchestrator | 2025-09-23 21:04:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:04:49.338991 | orchestrator | 2025-09-23 21:04:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:52.382595 | orchestrator | 2025-09-23 21:04:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:52.382994 | orchestrator | 2025-09-23 21:04:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:52.383025 | orchestrator | 2025-09-23 21:04:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:55.434653 | orchestrator | 2025-09-23 21:04:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:55.435804 | orchestrator | 2025-09-23 21:04:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:55.435921 | orchestrator | 2025-09-23 21:04:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:04:58.480499 | orchestrator | 2025-09-23 21:04:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:04:58.481634 | orchestrator | 2025-09-23 21:04:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:04:58.481867 | orchestrator | 2025-09-23 21:04:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:01.522432 | orchestrator | 2025-09-23 21:05:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:01.524436 | orchestrator | 2025-09-23 21:05:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:01.524514 | orchestrator | 2025-09-23 21:05:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:04.569794 | orchestrator | 2025-09-23 21:05:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:04.570887 | orchestrator | 2025-09-23 21:05:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:04.570974 | orchestrator | 2025-09-23 21:05:04 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:05:07.618615 | orchestrator | 2025-09-23 21:05:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:07.620778 | orchestrator | 2025-09-23 21:05:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:07.621046 | orchestrator | 2025-09-23 21:05:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:10.667291 | orchestrator | 2025-09-23 21:05:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:10.668772 | orchestrator | 2025-09-23 21:05:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:10.668815 | orchestrator | 2025-09-23 21:05:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:13.710941 | orchestrator | 2025-09-23 21:05:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:13.712481 | orchestrator | 2025-09-23 21:05:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:13.712539 | orchestrator | 2025-09-23 21:05:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:16.761471 | orchestrator | 2025-09-23 21:05:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:16.762084 | orchestrator | 2025-09-23 21:05:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:16.762119 | orchestrator | 2025-09-23 21:05:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:19.811835 | orchestrator | 2025-09-23 21:05:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:19.813074 | orchestrator | 2025-09-23 21:05:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:19.813311 | orchestrator | 2025-09-23 21:05:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:22.860477 | orchestrator | 
2025-09-23 21:05:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:22.861967 | orchestrator | 2025-09-23 21:05:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:22.862289 | orchestrator | 2025-09-23 21:05:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:25.907422 | orchestrator | 2025-09-23 21:05:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:25.909694 | orchestrator | 2025-09-23 21:05:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:25.909785 | orchestrator | 2025-09-23 21:05:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:28.949282 | orchestrator | 2025-09-23 21:05:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:28.952202 | orchestrator | 2025-09-23 21:05:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:28.952260 | orchestrator | 2025-09-23 21:05:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:31.996597 | orchestrator | 2025-09-23 21:05:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:31.998135 | orchestrator | 2025-09-23 21:05:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:31.998205 | orchestrator | 2025-09-23 21:05:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:35.047002 | orchestrator | 2025-09-23 21:05:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:35.049692 | orchestrator | 2025-09-23 21:05:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:35.049728 | orchestrator | 2025-09-23 21:05:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:38.091814 | orchestrator | 2025-09-23 21:05:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:05:38.092748 | orchestrator | 2025-09-23 21:05:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:38.092826 | orchestrator | 2025-09-23 21:05:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:41.134696 | orchestrator | 2025-09-23 21:05:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:41.136882 | orchestrator | 2025-09-23 21:05:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:41.136928 | orchestrator | 2025-09-23 21:05:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:44.175330 | orchestrator | 2025-09-23 21:05:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:44.176391 | orchestrator | 2025-09-23 21:05:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:44.176472 | orchestrator | 2025-09-23 21:05:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:47.222880 | orchestrator | 2025-09-23 21:05:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:47.224546 | orchestrator | 2025-09-23 21:05:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:47.224709 | orchestrator | 2025-09-23 21:05:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:50.268669 | orchestrator | 2025-09-23 21:05:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:50.270792 | orchestrator | 2025-09-23 21:05:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:50.270839 | orchestrator | 2025-09-23 21:05:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:53.312661 | orchestrator | 2025-09-23 21:05:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:53.314213 | orchestrator | 2025-09-23 21:05:53 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:53.314265 | orchestrator | 2025-09-23 21:05:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:56.358879 | orchestrator | 2025-09-23 21:05:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:56.359842 | orchestrator | 2025-09-23 21:05:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:56.359868 | orchestrator | 2025-09-23 21:05:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:05:59.402104 | orchestrator | 2025-09-23 21:05:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:05:59.404612 | orchestrator | 2025-09-23 21:05:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:05:59.404685 | orchestrator | 2025-09-23 21:05:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:06:02.450438 | orchestrator | 2025-09-23 21:06:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:06:02.451985 | orchestrator | 2025-09-23 21:06:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:06:02.452009 | orchestrator | 2025-09-23 21:06:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:06:05.499506 | orchestrator | 2025-09-23 21:06:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:06:05.500570 | orchestrator | 2025-09-23 21:06:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:06:05.500684 | orchestrator | 2025-09-23 21:06:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:06:08.552323 | orchestrator | 2025-09-23 21:06:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:06:08.553692 | orchestrator | 2025-09-23 21:06:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:06:08.553726 | orchestrator | 2025-09-23 21:06:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:06:11.595827 | orchestrator | 2025-09-23 21:06:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:06:11.598461 | orchestrator | 2025-09-23 21:06:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:06:11.598496 | orchestrator | 2025-09-23 21:06:11 | INFO  | Wait 1 second(s) until the next check
2025-09-23 21:11:40.704816 | orchestrator | 2025-09-23 21:11:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:11:40.705898 | orchestrator | 2025-09-23 21:11:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:11:40.705937 | orchestrator | 2025-09-23 21:11:40 | INFO  | Wait
1 second(s) until the next check 2025-09-23 21:11:43.757959 | orchestrator | 2025-09-23 21:11:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:11:43.759777 | orchestrator | 2025-09-23 21:11:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:11:43.759832 | orchestrator | 2025-09-23 21:11:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:11:46.806416 | orchestrator | 2025-09-23 21:11:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:11:46.808041 | orchestrator | 2025-09-23 21:11:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:11:46.808112 | orchestrator | 2025-09-23 21:11:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:11:49.877088 | orchestrator | 2025-09-23 21:11:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:11:49.877218 | orchestrator | 2025-09-23 21:11:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:11:49.877231 | orchestrator | 2025-09-23 21:11:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:11:52.925667 | orchestrator | 2025-09-23 21:11:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:11:52.928069 | orchestrator | 2025-09-23 21:11:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:11:52.928096 | orchestrator | 2025-09-23 21:11:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:11:55.969698 | orchestrator | 2025-09-23 21:11:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:11:55.971127 | orchestrator | 2025-09-23 21:11:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:11:55.971395 | orchestrator | 2025-09-23 21:11:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:11:59.013774 | orchestrator | 
2025-09-23 21:11:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:11:59.015649 | orchestrator | 2025-09-23 21:11:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:11:59.016143 | orchestrator | 2025-09-23 21:11:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:02.057052 | orchestrator | 2025-09-23 21:12:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:02.058246 | orchestrator | 2025-09-23 21:12:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:02.058281 | orchestrator | 2025-09-23 21:12:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:05.101336 | orchestrator | 2025-09-23 21:12:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:05.103926 | orchestrator | 2025-09-23 21:12:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:05.104419 | orchestrator | 2025-09-23 21:12:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:08.147371 | orchestrator | 2025-09-23 21:12:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:08.151134 | orchestrator | 2025-09-23 21:12:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:08.151169 | orchestrator | 2025-09-23 21:12:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:11.200354 | orchestrator | 2025-09-23 21:12:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:11.202174 | orchestrator | 2025-09-23 21:12:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:11.202218 | orchestrator | 2025-09-23 21:12:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:14.245434 | orchestrator | 2025-09-23 21:12:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:12:14.247531 | orchestrator | 2025-09-23 21:12:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:14.247788 | orchestrator | 2025-09-23 21:12:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:17.289625 | orchestrator | 2025-09-23 21:12:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:17.291968 | orchestrator | 2025-09-23 21:12:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:17.292002 | orchestrator | 2025-09-23 21:12:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:20.337935 | orchestrator | 2025-09-23 21:12:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:20.340519 | orchestrator | 2025-09-23 21:12:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:20.340551 | orchestrator | 2025-09-23 21:12:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:23.385634 | orchestrator | 2025-09-23 21:12:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:23.387057 | orchestrator | 2025-09-23 21:12:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:23.387091 | orchestrator | 2025-09-23 21:12:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:26.427892 | orchestrator | 2025-09-23 21:12:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:26.430137 | orchestrator | 2025-09-23 21:12:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:26.430651 | orchestrator | 2025-09-23 21:12:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:29.475208 | orchestrator | 2025-09-23 21:12:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:29.476973 | orchestrator | 2025-09-23 21:12:29 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:29.477003 | orchestrator | 2025-09-23 21:12:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:32.515900 | orchestrator | 2025-09-23 21:12:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:32.518187 | orchestrator | 2025-09-23 21:12:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:32.518226 | orchestrator | 2025-09-23 21:12:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:35.564267 | orchestrator | 2025-09-23 21:12:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:35.566362 | orchestrator | 2025-09-23 21:12:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:35.566596 | orchestrator | 2025-09-23 21:12:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:38.614418 | orchestrator | 2025-09-23 21:12:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:38.617072 | orchestrator | 2025-09-23 21:12:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:38.617129 | orchestrator | 2025-09-23 21:12:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:41.663694 | orchestrator | 2025-09-23 21:12:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:41.665111 | orchestrator | 2025-09-23 21:12:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:41.665168 | orchestrator | 2025-09-23 21:12:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:44.708282 | orchestrator | 2025-09-23 21:12:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:44.709548 | orchestrator | 2025-09-23 21:12:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:12:44.709582 | orchestrator | 2025-09-23 21:12:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:47.753977 | orchestrator | 2025-09-23 21:12:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:47.756258 | orchestrator | 2025-09-23 21:12:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:47.756299 | orchestrator | 2025-09-23 21:12:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:50.802077 | orchestrator | 2025-09-23 21:12:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:50.803600 | orchestrator | 2025-09-23 21:12:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:50.803844 | orchestrator | 2025-09-23 21:12:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:53.847473 | orchestrator | 2025-09-23 21:12:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:53.848444 | orchestrator | 2025-09-23 21:12:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:53.848494 | orchestrator | 2025-09-23 21:12:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:56.899198 | orchestrator | 2025-09-23 21:12:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:56.901244 | orchestrator | 2025-09-23 21:12:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:56.901342 | orchestrator | 2025-09-23 21:12:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:12:59.952991 | orchestrator | 2025-09-23 21:12:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:12:59.955201 | orchestrator | 2025-09-23 21:12:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:12:59.955258 | orchestrator | 2025-09-23 21:12:59 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:13:02.998918 | orchestrator | 2025-09-23 21:13:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:03.000672 | orchestrator | 2025-09-23 21:13:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:03.000711 | orchestrator | 2025-09-23 21:13:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:06.047360 | orchestrator | 2025-09-23 21:13:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:06.048166 | orchestrator | 2025-09-23 21:13:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:06.048200 | orchestrator | 2025-09-23 21:13:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:09.093174 | orchestrator | 2025-09-23 21:13:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:09.095784 | orchestrator | 2025-09-23 21:13:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:09.095856 | orchestrator | 2025-09-23 21:13:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:12.143220 | orchestrator | 2025-09-23 21:13:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:12.144888 | orchestrator | 2025-09-23 21:13:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:12.145323 | orchestrator | 2025-09-23 21:13:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:15.188891 | orchestrator | 2025-09-23 21:13:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:15.190664 | orchestrator | 2025-09-23 21:13:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:15.190740 | orchestrator | 2025-09-23 21:13:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:18.236817 | orchestrator | 
2025-09-23 21:13:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:18.238659 | orchestrator | 2025-09-23 21:13:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:18.238696 | orchestrator | 2025-09-23 21:13:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:21.282592 | orchestrator | 2025-09-23 21:13:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:21.284605 | orchestrator | 2025-09-23 21:13:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:21.284701 | orchestrator | 2025-09-23 21:13:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:24.326612 | orchestrator | 2025-09-23 21:13:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:24.328783 | orchestrator | 2025-09-23 21:13:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:24.328816 | orchestrator | 2025-09-23 21:13:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:27.375495 | orchestrator | 2025-09-23 21:13:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:27.377709 | orchestrator | 2025-09-23 21:13:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:27.377785 | orchestrator | 2025-09-23 21:13:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:30.417286 | orchestrator | 2025-09-23 21:13:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:30.419415 | orchestrator | 2025-09-23 21:13:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:30.419754 | orchestrator | 2025-09-23 21:13:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:33.464396 | orchestrator | 2025-09-23 21:13:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:13:33.465636 | orchestrator | 2025-09-23 21:13:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:33.465750 | orchestrator | 2025-09-23 21:13:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:36.517800 | orchestrator | 2025-09-23 21:13:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:36.519193 | orchestrator | 2025-09-23 21:13:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:36.519291 | orchestrator | 2025-09-23 21:13:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:39.564609 | orchestrator | 2025-09-23 21:13:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:39.568072 | orchestrator | 2025-09-23 21:13:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:39.568106 | orchestrator | 2025-09-23 21:13:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:42.618480 | orchestrator | 2025-09-23 21:13:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:42.619389 | orchestrator | 2025-09-23 21:13:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:42.619424 | orchestrator | 2025-09-23 21:13:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:45.668439 | orchestrator | 2025-09-23 21:13:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:45.671130 | orchestrator | 2025-09-23 21:13:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:45.671382 | orchestrator | 2025-09-23 21:13:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:48.711191 | orchestrator | 2025-09-23 21:13:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:48.712961 | orchestrator | 2025-09-23 21:13:48 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:48.712994 | orchestrator | 2025-09-23 21:13:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:51.755692 | orchestrator | 2025-09-23 21:13:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:51.756894 | orchestrator | 2025-09-23 21:13:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:51.756923 | orchestrator | 2025-09-23 21:13:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:54.802802 | orchestrator | 2025-09-23 21:13:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:54.804155 | orchestrator | 2025-09-23 21:13:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:54.804191 | orchestrator | 2025-09-23 21:13:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:13:57.849921 | orchestrator | 2025-09-23 21:13:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:13:57.851110 | orchestrator | 2025-09-23 21:13:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:13:57.851160 | orchestrator | 2025-09-23 21:13:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:00.895468 | orchestrator | 2025-09-23 21:14:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:00.898526 | orchestrator | 2025-09-23 21:14:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:00.898652 | orchestrator | 2025-09-23 21:14:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:03.941400 | orchestrator | 2025-09-23 21:14:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:03.942874 | orchestrator | 2025-09-23 21:14:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:14:03.943170 | orchestrator | 2025-09-23 21:14:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:06.985726 | orchestrator | 2025-09-23 21:14:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:06.986810 | orchestrator | 2025-09-23 21:14:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:06.986892 | orchestrator | 2025-09-23 21:14:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:10.032426 | orchestrator | 2025-09-23 21:14:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:10.034205 | orchestrator | 2025-09-23 21:14:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:10.034291 | orchestrator | 2025-09-23 21:14:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:13.076724 | orchestrator | 2025-09-23 21:14:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:13.078438 | orchestrator | 2025-09-23 21:14:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:13.078474 | orchestrator | 2025-09-23 21:14:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:16.126168 | orchestrator | 2025-09-23 21:14:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:16.128346 | orchestrator | 2025-09-23 21:14:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:16.128401 | orchestrator | 2025-09-23 21:14:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:19.169849 | orchestrator | 2025-09-23 21:14:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:19.171741 | orchestrator | 2025-09-23 21:14:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:19.171851 | orchestrator | 2025-09-23 21:14:19 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:14:22.217694 | orchestrator | 2025-09-23 21:14:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:22.219370 | orchestrator | 2025-09-23 21:14:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:22.219406 | orchestrator | 2025-09-23 21:14:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:25.259314 | orchestrator | 2025-09-23 21:14:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:25.259788 | orchestrator | 2025-09-23 21:14:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:25.259856 | orchestrator | 2025-09-23 21:14:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:28.306269 | orchestrator | 2025-09-23 21:14:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:28.307131 | orchestrator | 2025-09-23 21:14:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:28.307166 | orchestrator | 2025-09-23 21:14:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:31.346286 | orchestrator | 2025-09-23 21:14:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:31.347678 | orchestrator | 2025-09-23 21:14:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:31.347713 | orchestrator | 2025-09-23 21:14:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:34.396229 | orchestrator | 2025-09-23 21:14:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:34.398399 | orchestrator | 2025-09-23 21:14:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:34.398540 | orchestrator | 2025-09-23 21:14:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:37.444179 | orchestrator | 
2025-09-23 21:14:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:37.446239 | orchestrator | 2025-09-23 21:14:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:37.446287 | orchestrator | 2025-09-23 21:14:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:40.491532 | orchestrator | 2025-09-23 21:14:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:40.493042 | orchestrator | 2025-09-23 21:14:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:40.493429 | orchestrator | 2025-09-23 21:14:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:43.532805 | orchestrator | 2025-09-23 21:14:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:43.533983 | orchestrator | 2025-09-23 21:14:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:43.534059 | orchestrator | 2025-09-23 21:14:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:46.577240 | orchestrator | 2025-09-23 21:14:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:46.578526 | orchestrator | 2025-09-23 21:14:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:46.578575 | orchestrator | 2025-09-23 21:14:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:49.620238 | orchestrator | 2025-09-23 21:14:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:49.620829 | orchestrator | 2025-09-23 21:14:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:49.620905 | orchestrator | 2025-09-23 21:14:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:52.664144 | orchestrator | 2025-09-23 21:14:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:14:52.666651 | orchestrator | 2025-09-23 21:14:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:52.666716 | orchestrator | 2025-09-23 21:14:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:55.705945 | orchestrator | 2025-09-23 21:14:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:55.708319 | orchestrator | 2025-09-23 21:14:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:55.708365 | orchestrator | 2025-09-23 21:14:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:14:58.750519 | orchestrator | 2025-09-23 21:14:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:14:58.752026 | orchestrator | 2025-09-23 21:14:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:14:58.752226 | orchestrator | 2025-09-23 21:14:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:15:01.799570 | orchestrator | 2025-09-23 21:15:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:15:01.801101 | orchestrator | 2025-09-23 21:15:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:15:01.801133 | orchestrator | 2025-09-23 21:15:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:15:04.847599 | orchestrator | 2025-09-23 21:15:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:15:04.848913 | orchestrator | 2025-09-23 21:15:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:15:04.848954 | orchestrator | 2025-09-23 21:15:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:15:07.893438 | orchestrator | 2025-09-23 21:15:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:15:07.895454 | orchestrator | 2025-09-23 21:15:07 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:15:07.895653 | orchestrator | 2025-09-23 21:15:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:15:10.938305 | orchestrator | 2025-09-23 21:15:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:15:10.940259 | orchestrator | 2025-09-23 21:15:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:15:10.940384 | orchestrator | 2025-09-23 21:15:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:15:13.983444 | orchestrator | 2025-09-23 21:15:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:15:13.984075 | orchestrator | 2025-09-23 21:15:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:15:13.984109 | orchestrator | 2025-09-23 21:15:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:15:17.035141 | orchestrator | 2025-09-23 21:15:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:15:17.037381 | orchestrator | 2025-09-23 21:15:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:15:17.037468 | orchestrator | 2025-09-23 21:15:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:15:20.083881 | orchestrator | 2025-09-23 21:15:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:15:20.085209 | orchestrator | 2025-09-23 21:15:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:15:20.085299 | orchestrator | 2025-09-23 21:15:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:15:23.134466 | orchestrator | 2025-09-23 21:15:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:15:23.135621 | orchestrator | 2025-09-23 21:15:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:15:23.135654 | orchestrator | 2025-09-23 21:15:23 | INFO  | Wait 1 second(s) until the next check
2025-09-23 21:15:26.183918 | orchestrator | 2025-09-23 21:15:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 21:15:26.185301 | orchestrator | 2025-09-23 21:15:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 21:15:26.185361 | orchestrator | 2025-09-23 21:15:26 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 21:15:29 through 21:20:21; both tasks remained in state STARTED throughout ...]
2025-09-23 21:20:24.757434 | orchestrator | 2025-09-23 21:20:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 21:20:24.759176 | orchestrator | 2025-09-23 21:20:24 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:24.759239 | orchestrator | 2025-09-23 21:20:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:27.811560 | orchestrator | 2025-09-23 21:20:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:27.813215 | orchestrator | 2025-09-23 21:20:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:27.813515 | orchestrator | 2025-09-23 21:20:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:30.861900 | orchestrator | 2025-09-23 21:20:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:30.863803 | orchestrator | 2025-09-23 21:20:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:30.863833 | orchestrator | 2025-09-23 21:20:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:33.906196 | orchestrator | 2025-09-23 21:20:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:33.907832 | orchestrator | 2025-09-23 21:20:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:33.907965 | orchestrator | 2025-09-23 21:20:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:36.956111 | orchestrator | 2025-09-23 21:20:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:36.957599 | orchestrator | 2025-09-23 21:20:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:36.957625 | orchestrator | 2025-09-23 21:20:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:40.001382 | orchestrator | 2025-09-23 21:20:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:40.003604 | orchestrator | 2025-09-23 21:20:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:20:40.003716 | orchestrator | 2025-09-23 21:20:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:43.050387 | orchestrator | 2025-09-23 21:20:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:43.052578 | orchestrator | 2025-09-23 21:20:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:43.052632 | orchestrator | 2025-09-23 21:20:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:46.098313 | orchestrator | 2025-09-23 21:20:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:46.100104 | orchestrator | 2025-09-23 21:20:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:46.100184 | orchestrator | 2025-09-23 21:20:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:49.142880 | orchestrator | 2025-09-23 21:20:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:49.144180 | orchestrator | 2025-09-23 21:20:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:49.144577 | orchestrator | 2025-09-23 21:20:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:52.197584 | orchestrator | 2025-09-23 21:20:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:52.198967 | orchestrator | 2025-09-23 21:20:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:52.199045 | orchestrator | 2025-09-23 21:20:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:20:55.244295 | orchestrator | 2025-09-23 21:20:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:55.245501 | orchestrator | 2025-09-23 21:20:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:55.245544 | orchestrator | 2025-09-23 21:20:55 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:20:58.290353 | orchestrator | 2025-09-23 21:20:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:20:58.292357 | orchestrator | 2025-09-23 21:20:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:20:58.292385 | orchestrator | 2025-09-23 21:20:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:01.336312 | orchestrator | 2025-09-23 21:21:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:01.337213 | orchestrator | 2025-09-23 21:21:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:01.337245 | orchestrator | 2025-09-23 21:21:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:04.384466 | orchestrator | 2025-09-23 21:21:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:04.385522 | orchestrator | 2025-09-23 21:21:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:04.385541 | orchestrator | 2025-09-23 21:21:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:07.431388 | orchestrator | 2025-09-23 21:21:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:07.432206 | orchestrator | 2025-09-23 21:21:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:07.432231 | orchestrator | 2025-09-23 21:21:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:10.472046 | orchestrator | 2025-09-23 21:21:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:10.474149 | orchestrator | 2025-09-23 21:21:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:10.474375 | orchestrator | 2025-09-23 21:21:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:13.521492 | orchestrator | 
2025-09-23 21:21:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:13.524486 | orchestrator | 2025-09-23 21:21:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:13.524530 | orchestrator | 2025-09-23 21:21:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:16.575161 | orchestrator | 2025-09-23 21:21:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:16.577851 | orchestrator | 2025-09-23 21:21:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:16.577980 | orchestrator | 2025-09-23 21:21:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:19.626917 | orchestrator | 2025-09-23 21:21:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:19.628974 | orchestrator | 2025-09-23 21:21:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:19.629009 | orchestrator | 2025-09-23 21:21:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:22.680988 | orchestrator | 2025-09-23 21:21:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:22.682269 | orchestrator | 2025-09-23 21:21:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:22.682733 | orchestrator | 2025-09-23 21:21:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:25.723732 | orchestrator | 2025-09-23 21:21:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:25.725822 | orchestrator | 2025-09-23 21:21:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:25.726098 | orchestrator | 2025-09-23 21:21:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:28.770515 | orchestrator | 2025-09-23 21:21:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:21:28.772032 | orchestrator | 2025-09-23 21:21:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:28.772136 | orchestrator | 2025-09-23 21:21:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:31.814787 | orchestrator | 2025-09-23 21:21:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:31.816612 | orchestrator | 2025-09-23 21:21:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:31.816760 | orchestrator | 2025-09-23 21:21:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:34.865163 | orchestrator | 2025-09-23 21:21:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:34.866748 | orchestrator | 2025-09-23 21:21:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:34.866779 | orchestrator | 2025-09-23 21:21:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:37.912648 | orchestrator | 2025-09-23 21:21:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:37.914129 | orchestrator | 2025-09-23 21:21:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:37.914160 | orchestrator | 2025-09-23 21:21:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:40.955780 | orchestrator | 2025-09-23 21:21:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:40.957212 | orchestrator | 2025-09-23 21:21:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:40.957307 | orchestrator | 2025-09-23 21:21:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:44.001887 | orchestrator | 2025-09-23 21:21:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:44.003245 | orchestrator | 2025-09-23 21:21:44 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:44.003294 | orchestrator | 2025-09-23 21:21:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:47.039215 | orchestrator | 2025-09-23 21:21:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:47.040783 | orchestrator | 2025-09-23 21:21:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:47.040843 | orchestrator | 2025-09-23 21:21:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:50.079288 | orchestrator | 2025-09-23 21:21:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:50.081032 | orchestrator | 2025-09-23 21:21:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:50.081260 | orchestrator | 2025-09-23 21:21:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:53.127229 | orchestrator | 2025-09-23 21:21:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:53.129090 | orchestrator | 2025-09-23 21:21:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:53.129126 | orchestrator | 2025-09-23 21:21:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:56.179565 | orchestrator | 2025-09-23 21:21:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:56.182444 | orchestrator | 2025-09-23 21:21:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:21:56.182592 | orchestrator | 2025-09-23 21:21:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:21:59.232001 | orchestrator | 2025-09-23 21:21:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:21:59.234464 | orchestrator | 2025-09-23 21:21:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:21:59.234549 | orchestrator | 2025-09-23 21:21:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:02.282270 | orchestrator | 2025-09-23 21:22:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:02.285323 | orchestrator | 2025-09-23 21:22:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:02.285353 | orchestrator | 2025-09-23 21:22:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:05.337538 | orchestrator | 2025-09-23 21:22:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:05.338551 | orchestrator | 2025-09-23 21:22:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:05.338857 | orchestrator | 2025-09-23 21:22:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:08.385419 | orchestrator | 2025-09-23 21:22:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:08.387919 | orchestrator | 2025-09-23 21:22:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:08.387955 | orchestrator | 2025-09-23 21:22:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:11.437855 | orchestrator | 2025-09-23 21:22:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:11.438916 | orchestrator | 2025-09-23 21:22:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:11.438950 | orchestrator | 2025-09-23 21:22:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:14.483166 | orchestrator | 2025-09-23 21:22:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:14.485041 | orchestrator | 2025-09-23 21:22:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:14.485295 | orchestrator | 2025-09-23 21:22:14 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:22:17.533732 | orchestrator | 2025-09-23 21:22:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:17.534838 | orchestrator | 2025-09-23 21:22:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:17.535348 | orchestrator | 2025-09-23 21:22:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:20.581520 | orchestrator | 2025-09-23 21:22:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:20.582398 | orchestrator | 2025-09-23 21:22:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:20.582434 | orchestrator | 2025-09-23 21:22:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:23.626767 | orchestrator | 2025-09-23 21:22:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:23.628070 | orchestrator | 2025-09-23 21:22:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:23.628108 | orchestrator | 2025-09-23 21:22:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:26.677779 | orchestrator | 2025-09-23 21:22:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:26.679347 | orchestrator | 2025-09-23 21:22:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:26.679383 | orchestrator | 2025-09-23 21:22:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:29.723213 | orchestrator | 2025-09-23 21:22:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:29.724803 | orchestrator | 2025-09-23 21:22:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:29.725211 | orchestrator | 2025-09-23 21:22:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:32.766238 | orchestrator | 
2025-09-23 21:22:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:32.767173 | orchestrator | 2025-09-23 21:22:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:32.767501 | orchestrator | 2025-09-23 21:22:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:35.815383 | orchestrator | 2025-09-23 21:22:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:35.816948 | orchestrator | 2025-09-23 21:22:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:35.817050 | orchestrator | 2025-09-23 21:22:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:38.858728 | orchestrator | 2025-09-23 21:22:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:38.860911 | orchestrator | 2025-09-23 21:22:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:38.860951 | orchestrator | 2025-09-23 21:22:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:41.905166 | orchestrator | 2025-09-23 21:22:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:41.907060 | orchestrator | 2025-09-23 21:22:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:41.907091 | orchestrator | 2025-09-23 21:22:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:44.948557 | orchestrator | 2025-09-23 21:22:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:44.950654 | orchestrator | 2025-09-23 21:22:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:44.950688 | orchestrator | 2025-09-23 21:22:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:47.995532 | orchestrator | 2025-09-23 21:22:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:22:47.995983 | orchestrator | 2025-09-23 21:22:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:47.996077 | orchestrator | 2025-09-23 21:22:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:51.045942 | orchestrator | 2025-09-23 21:22:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:51.046527 | orchestrator | 2025-09-23 21:22:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:51.046559 | orchestrator | 2025-09-23 21:22:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:54.095759 | orchestrator | 2025-09-23 21:22:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:54.097305 | orchestrator | 2025-09-23 21:22:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:54.097371 | orchestrator | 2025-09-23 21:22:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:22:57.142985 | orchestrator | 2025-09-23 21:22:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:22:57.145194 | orchestrator | 2025-09-23 21:22:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:22:57.145245 | orchestrator | 2025-09-23 21:22:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:00.188417 | orchestrator | 2025-09-23 21:23:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:00.189979 | orchestrator | 2025-09-23 21:23:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:00.190007 | orchestrator | 2025-09-23 21:23:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:03.231695 | orchestrator | 2025-09-23 21:23:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:03.232417 | orchestrator | 2025-09-23 21:23:03 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:03.232452 | orchestrator | 2025-09-23 21:23:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:06.277508 | orchestrator | 2025-09-23 21:23:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:06.279457 | orchestrator | 2025-09-23 21:23:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:06.279545 | orchestrator | 2025-09-23 21:23:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:09.325191 | orchestrator | 2025-09-23 21:23:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:09.326997 | orchestrator | 2025-09-23 21:23:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:09.327212 | orchestrator | 2025-09-23 21:23:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:12.374280 | orchestrator | 2025-09-23 21:23:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:12.375745 | orchestrator | 2025-09-23 21:23:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:12.375801 | orchestrator | 2025-09-23 21:23:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:15.418725 | orchestrator | 2025-09-23 21:23:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:15.419938 | orchestrator | 2025-09-23 21:23:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:15.419967 | orchestrator | 2025-09-23 21:23:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:18.462383 | orchestrator | 2025-09-23 21:23:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:18.463697 | orchestrator | 2025-09-23 21:23:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:23:18.463729 | orchestrator | 2025-09-23 21:23:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:21.511096 | orchestrator | 2025-09-23 21:23:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:21.512117 | orchestrator | 2025-09-23 21:23:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:21.512145 | orchestrator | 2025-09-23 21:23:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:24.559050 | orchestrator | 2025-09-23 21:23:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:24.561803 | orchestrator | 2025-09-23 21:23:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:24.562120 | orchestrator | 2025-09-23 21:23:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:27.607461 | orchestrator | 2025-09-23 21:23:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:27.609316 | orchestrator | 2025-09-23 21:23:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:27.609446 | orchestrator | 2025-09-23 21:23:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:30.644655 | orchestrator | 2025-09-23 21:23:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:30.645628 | orchestrator | 2025-09-23 21:23:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:30.645651 | orchestrator | 2025-09-23 21:23:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:33.691505 | orchestrator | 2025-09-23 21:23:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:33.694952 | orchestrator | 2025-09-23 21:23:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:33.695002 | orchestrator | 2025-09-23 21:23:33 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:23:36.745500 | orchestrator | 2025-09-23 21:23:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:36.748090 | orchestrator | 2025-09-23 21:23:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:36.748130 | orchestrator | 2025-09-23 21:23:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:39.796204 | orchestrator | 2025-09-23 21:23:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:39.797854 | orchestrator | 2025-09-23 21:23:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:39.797886 | orchestrator | 2025-09-23 21:23:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:42.845012 | orchestrator | 2025-09-23 21:23:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:42.846833 | orchestrator | 2025-09-23 21:23:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:42.846990 | orchestrator | 2025-09-23 21:23:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:45.890597 | orchestrator | 2025-09-23 21:23:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:45.893358 | orchestrator | 2025-09-23 21:23:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:45.893547 | orchestrator | 2025-09-23 21:23:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:48.939836 | orchestrator | 2025-09-23 21:23:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:48.941100 | orchestrator | 2025-09-23 21:23:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:48.941177 | orchestrator | 2025-09-23 21:23:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:51.985165 | orchestrator | 
2025-09-23 21:23:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:51.987073 | orchestrator | 2025-09-23 21:23:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:51.987179 | orchestrator | 2025-09-23 21:23:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:55.036087 | orchestrator | 2025-09-23 21:23:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:55.038808 | orchestrator | 2025-09-23 21:23:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:55.038876 | orchestrator | 2025-09-23 21:23:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:23:58.085372 | orchestrator | 2025-09-23 21:23:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:23:58.086452 | orchestrator | 2025-09-23 21:23:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:23:58.086581 | orchestrator | 2025-09-23 21:23:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:24:01.131091 | orchestrator | 2025-09-23 21:24:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:24:01.132180 | orchestrator | 2025-09-23 21:24:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:24:01.132220 | orchestrator | 2025-09-23 21:24:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:24:04.178336 | orchestrator | 2025-09-23 21:24:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:24:04.179327 | orchestrator | 2025-09-23 21:24:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:24:04.179414 | orchestrator | 2025-09-23 21:24:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:24:07.224951 | orchestrator | 2025-09-23 21:24:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:24:07.226602 | orchestrator | 2025-09-23 21:24:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:24:07.226637 | orchestrator | 2025-09-23 21:24:07 | INFO  | Wait 1 second(s) until the next check
2025-09-23 21:29:39.301629 | orchestrator | 2025-09-23 21:29:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:29:39.304265 | orchestrator | 2025-09-23 21:29:39 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:29:39.304385 | orchestrator | 2025-09-23 21:29:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:29:42.342460 | orchestrator | 2025-09-23 21:29:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:29:42.344616 | orchestrator | 2025-09-23 21:29:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:29:42.344956 | orchestrator | 2025-09-23 21:29:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:29:45.392013 | orchestrator | 2025-09-23 21:29:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:29:45.393414 | orchestrator | 2025-09-23 21:29:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:29:45.393444 | orchestrator | 2025-09-23 21:29:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:29:48.438524 | orchestrator | 2025-09-23 21:29:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:29:48.439791 | orchestrator | 2025-09-23 21:29:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:29:48.439876 | orchestrator | 2025-09-23 21:29:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:29:51.484092 | orchestrator | 2025-09-23 21:29:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:29:51.485829 | orchestrator | 2025-09-23 21:29:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:29:51.485947 | orchestrator | 2025-09-23 21:29:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:29:54.530674 | orchestrator | 2025-09-23 21:29:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:29:54.532823 | orchestrator | 2025-09-23 21:29:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:29:54.532856 | orchestrator | 2025-09-23 21:29:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:29:57.576265 | orchestrator | 2025-09-23 21:29:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:29:57.577277 | orchestrator | 2025-09-23 21:29:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:29:57.577310 | orchestrator | 2025-09-23 21:29:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:00.621104 | orchestrator | 2025-09-23 21:30:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:00.622927 | orchestrator | 2025-09-23 21:30:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:00.622974 | orchestrator | 2025-09-23 21:30:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:03.667793 | orchestrator | 2025-09-23 21:30:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:03.669834 | orchestrator | 2025-09-23 21:30:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:03.669866 | orchestrator | 2025-09-23 21:30:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:06.718007 | orchestrator | 2025-09-23 21:30:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:06.719207 | orchestrator | 2025-09-23 21:30:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:06.719246 | orchestrator | 2025-09-23 21:30:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:09.759349 | orchestrator | 2025-09-23 21:30:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:09.760964 | orchestrator | 2025-09-23 21:30:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:09.761277 | orchestrator | 2025-09-23 21:30:09 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:30:12.800629 | orchestrator | 2025-09-23 21:30:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:12.802086 | orchestrator | 2025-09-23 21:30:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:12.802208 | orchestrator | 2025-09-23 21:30:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:15.843929 | orchestrator | 2025-09-23 21:30:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:15.845164 | orchestrator | 2025-09-23 21:30:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:15.845251 | orchestrator | 2025-09-23 21:30:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:18.889870 | orchestrator | 2025-09-23 21:30:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:18.890895 | orchestrator | 2025-09-23 21:30:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:18.890934 | orchestrator | 2025-09-23 21:30:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:21.932468 | orchestrator | 2025-09-23 21:30:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:21.934446 | orchestrator | 2025-09-23 21:30:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:21.934514 | orchestrator | 2025-09-23 21:30:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:24.974645 | orchestrator | 2025-09-23 21:30:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:24.975917 | orchestrator | 2025-09-23 21:30:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:24.975952 | orchestrator | 2025-09-23 21:30:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:28.020811 | orchestrator | 
2025-09-23 21:30:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:28.022661 | orchestrator | 2025-09-23 21:30:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:28.022696 | orchestrator | 2025-09-23 21:30:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:31.071391 | orchestrator | 2025-09-23 21:30:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:31.071585 | orchestrator | 2025-09-23 21:30:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:31.071606 | orchestrator | 2025-09-23 21:30:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:34.111576 | orchestrator | 2025-09-23 21:30:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:34.113072 | orchestrator | 2025-09-23 21:30:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:34.113111 | orchestrator | 2025-09-23 21:30:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:37.156913 | orchestrator | 2025-09-23 21:30:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:37.158326 | orchestrator | 2025-09-23 21:30:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:37.158412 | orchestrator | 2025-09-23 21:30:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:40.201136 | orchestrator | 2025-09-23 21:30:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:40.202077 | orchestrator | 2025-09-23 21:30:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:40.202117 | orchestrator | 2025-09-23 21:30:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:43.248253 | orchestrator | 2025-09-23 21:30:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:30:43.250098 | orchestrator | 2025-09-23 21:30:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:43.250172 | orchestrator | 2025-09-23 21:30:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:46.301421 | orchestrator | 2025-09-23 21:30:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:46.302881 | orchestrator | 2025-09-23 21:30:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:46.302943 | orchestrator | 2025-09-23 21:30:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:49.339993 | orchestrator | 2025-09-23 21:30:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:49.342542 | orchestrator | 2025-09-23 21:30:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:49.342648 | orchestrator | 2025-09-23 21:30:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:52.389544 | orchestrator | 2025-09-23 21:30:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:52.390617 | orchestrator | 2025-09-23 21:30:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:52.390651 | orchestrator | 2025-09-23 21:30:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:55.437599 | orchestrator | 2025-09-23 21:30:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:55.438738 | orchestrator | 2025-09-23 21:30:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:55.438772 | orchestrator | 2025-09-23 21:30:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:30:58.481573 | orchestrator | 2025-09-23 21:30:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:30:58.481980 | orchestrator | 2025-09-23 21:30:58 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:30:58.482014 | orchestrator | 2025-09-23 21:30:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:01.529258 | orchestrator | 2025-09-23 21:31:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:01.529611 | orchestrator | 2025-09-23 21:31:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:01.529641 | orchestrator | 2025-09-23 21:31:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:04.570483 | orchestrator | 2025-09-23 21:31:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:04.571608 | orchestrator | 2025-09-23 21:31:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:04.571696 | orchestrator | 2025-09-23 21:31:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:07.616245 | orchestrator | 2025-09-23 21:31:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:07.617405 | orchestrator | 2025-09-23 21:31:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:07.617474 | orchestrator | 2025-09-23 21:31:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:10.666003 | orchestrator | 2025-09-23 21:31:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:10.668026 | orchestrator | 2025-09-23 21:31:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:10.668066 | orchestrator | 2025-09-23 21:31:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:13.721477 | orchestrator | 2025-09-23 21:31:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:13.722678 | orchestrator | 2025-09-23 21:31:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:31:13.722714 | orchestrator | 2025-09-23 21:31:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:16.770951 | orchestrator | 2025-09-23 21:31:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:16.772834 | orchestrator | 2025-09-23 21:31:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:16.773088 | orchestrator | 2025-09-23 21:31:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:19.823467 | orchestrator | 2025-09-23 21:31:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:19.824290 | orchestrator | 2025-09-23 21:31:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:19.824344 | orchestrator | 2025-09-23 21:31:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:22.872471 | orchestrator | 2025-09-23 21:31:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:22.873381 | orchestrator | 2025-09-23 21:31:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:22.873412 | orchestrator | 2025-09-23 21:31:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:25.917016 | orchestrator | 2025-09-23 21:31:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:25.917884 | orchestrator | 2025-09-23 21:31:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:25.918057 | orchestrator | 2025-09-23 21:31:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:28.964525 | orchestrator | 2025-09-23 21:31:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:28.966064 | orchestrator | 2025-09-23 21:31:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:28.966154 | orchestrator | 2025-09-23 21:31:28 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:31:32.014993 | orchestrator | 2025-09-23 21:31:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:32.015972 | orchestrator | 2025-09-23 21:31:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:32.016270 | orchestrator | 2025-09-23 21:31:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:35.059245 | orchestrator | 2025-09-23 21:31:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:35.061353 | orchestrator | 2025-09-23 21:31:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:35.061386 | orchestrator | 2025-09-23 21:31:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:38.108753 | orchestrator | 2025-09-23 21:31:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:38.111210 | orchestrator | 2025-09-23 21:31:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:38.111426 | orchestrator | 2025-09-23 21:31:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:41.156102 | orchestrator | 2025-09-23 21:31:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:41.157932 | orchestrator | 2025-09-23 21:31:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:41.157966 | orchestrator | 2025-09-23 21:31:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:44.201967 | orchestrator | 2025-09-23 21:31:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:44.204407 | orchestrator | 2025-09-23 21:31:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:44.204448 | orchestrator | 2025-09-23 21:31:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:47.249149 | orchestrator | 
2025-09-23 21:31:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:47.249750 | orchestrator | 2025-09-23 21:31:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:47.249824 | orchestrator | 2025-09-23 21:31:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:50.292713 | orchestrator | 2025-09-23 21:31:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:50.294669 | orchestrator | 2025-09-23 21:31:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:50.294720 | orchestrator | 2025-09-23 21:31:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:53.341072 | orchestrator | 2025-09-23 21:31:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:53.342010 | orchestrator | 2025-09-23 21:31:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:53.342099 | orchestrator | 2025-09-23 21:31:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:56.388420 | orchestrator | 2025-09-23 21:31:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:56.391812 | orchestrator | 2025-09-23 21:31:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:56.391846 | orchestrator | 2025-09-23 21:31:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:31:59.435187 | orchestrator | 2025-09-23 21:31:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:31:59.436932 | orchestrator | 2025-09-23 21:31:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:31:59.436965 | orchestrator | 2025-09-23 21:31:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:02.477483 | orchestrator | 2025-09-23 21:32:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:32:02.479366 | orchestrator | 2025-09-23 21:32:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:02.479403 | orchestrator | 2025-09-23 21:32:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:05.517970 | orchestrator | 2025-09-23 21:32:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:05.519416 | orchestrator | 2025-09-23 21:32:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:05.519507 | orchestrator | 2025-09-23 21:32:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:08.560238 | orchestrator | 2025-09-23 21:32:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:08.562483 | orchestrator | 2025-09-23 21:32:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:08.562651 | orchestrator | 2025-09-23 21:32:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:11.614145 | orchestrator | 2025-09-23 21:32:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:11.616283 | orchestrator | 2025-09-23 21:32:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:11.616387 | orchestrator | 2025-09-23 21:32:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:14.659453 | orchestrator | 2025-09-23 21:32:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:14.663187 | orchestrator | 2025-09-23 21:32:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:14.663225 | orchestrator | 2025-09-23 21:32:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:17.709625 | orchestrator | 2025-09-23 21:32:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:17.711393 | orchestrator | 2025-09-23 21:32:17 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:17.793092 | orchestrator | 2025-09-23 21:32:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:20.757558 | orchestrator | 2025-09-23 21:32:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:20.759616 | orchestrator | 2025-09-23 21:32:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:20.759948 | orchestrator | 2025-09-23 21:32:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:23.804628 | orchestrator | 2025-09-23 21:32:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:23.806592 | orchestrator | 2025-09-23 21:32:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:23.806731 | orchestrator | 2025-09-23 21:32:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:26.854766 | orchestrator | 2025-09-23 21:32:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:26.856200 | orchestrator | 2025-09-23 21:32:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:26.856228 | orchestrator | 2025-09-23 21:32:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:29.900254 | orchestrator | 2025-09-23 21:32:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:29.901511 | orchestrator | 2025-09-23 21:32:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:29.901539 | orchestrator | 2025-09-23 21:32:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:32.939651 | orchestrator | 2025-09-23 21:32:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:32.940350 | orchestrator | 2025-09-23 21:32:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:32:32.940397 | orchestrator | 2025-09-23 21:32:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:35.990473 | orchestrator | 2025-09-23 21:32:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:35.992137 | orchestrator | 2025-09-23 21:32:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:35.992402 | orchestrator | 2025-09-23 21:32:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:39.034349 | orchestrator | 2025-09-23 21:32:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:39.035645 | orchestrator | 2025-09-23 21:32:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:39.035931 | orchestrator | 2025-09-23 21:32:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:42.086731 | orchestrator | 2025-09-23 21:32:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:42.088226 | orchestrator | 2025-09-23 21:32:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:42.088418 | orchestrator | 2025-09-23 21:32:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:45.130505 | orchestrator | 2025-09-23 21:32:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:45.131151 | orchestrator | 2025-09-23 21:32:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:45.131179 | orchestrator | 2025-09-23 21:32:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:48.177786 | orchestrator | 2025-09-23 21:32:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:48.179473 | orchestrator | 2025-09-23 21:32:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:48.179503 | orchestrator | 2025-09-23 21:32:48 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:32:51.223101 | orchestrator | 2025-09-23 21:32:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:51.224557 | orchestrator | 2025-09-23 21:32:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:51.224594 | orchestrator | 2025-09-23 21:32:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:54.265513 | orchestrator | 2025-09-23 21:32:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:54.267301 | orchestrator | 2025-09-23 21:32:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:54.267337 | orchestrator | 2025-09-23 21:32:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:32:57.310777 | orchestrator | 2025-09-23 21:32:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:32:57.312814 | orchestrator | 2025-09-23 21:32:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:32:57.313965 | orchestrator | 2025-09-23 21:32:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:33:00.360844 | orchestrator | 2025-09-23 21:33:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:33:00.362513 | orchestrator | 2025-09-23 21:33:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:33:00.362670 | orchestrator | 2025-09-23 21:33:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:33:03.408433 | orchestrator | 2025-09-23 21:33:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:33:03.410247 | orchestrator | 2025-09-23 21:33:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:33:03.410321 | orchestrator | 2025-09-23 21:33:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:33:06.450353 | orchestrator | 
2025-09-23 21:33:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:33:06.451822 | orchestrator | 2025-09-23 21:33:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:33:06.451945 | orchestrator | 2025-09-23 21:33:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:33:09.497175 | orchestrator | 2025-09-23 21:33:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:33:09.498604 | orchestrator | 2025-09-23 21:33:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:33:09.498680 | orchestrator | 2025-09-23 21:33:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:33:12.545211 | orchestrator | 2025-09-23 21:33:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:33:12.547624 | orchestrator | 2025-09-23 21:33:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:33:12.547789 | orchestrator | 2025-09-23 21:33:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:33:15.594719 | orchestrator | 2025-09-23 21:33:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:33:15.595589 | orchestrator | 2025-09-23 21:33:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:33:15.595770 | orchestrator | 2025-09-23 21:33:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:33:18.639602 | orchestrator | 2025-09-23 21:33:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:33:18.640938 | orchestrator | 2025-09-23 21:33:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:33:18.640992 | orchestrator | 2025-09-23 21:33:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:33:21.688096 | orchestrator | 2025-09-23 21:33:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED
2025-09-23 21:33:21.690116 | orchestrator | 2025-09-23 21:33:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 21:33:21.690167 | orchestrator | 2025-09-23 21:33:21 | INFO  | Wait 1 second(s) until the next check
[... identical polling output elided: tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f are re-checked roughly every 3 seconds and both remain in state STARTED from 21:33:24 through 21:38:35 ...]
2025-09-23 21:38:38.517787 | orchestrator | 2025-09-23 21:38:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in
state STARTED 2025-09-23 21:38:38.519399 | orchestrator | 2025-09-23 21:38:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:38:38.519455 | orchestrator | 2025-09-23 21:38:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:38:41.561317 | orchestrator | 2025-09-23 21:38:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:38:41.563213 | orchestrator | 2025-09-23 21:38:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:38:41.563252 | orchestrator | 2025-09-23 21:38:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:38:44.606984 | orchestrator | 2025-09-23 21:38:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:38:44.608321 | orchestrator | 2025-09-23 21:38:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:38:44.608354 | orchestrator | 2025-09-23 21:38:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:38:47.656304 | orchestrator | 2025-09-23 21:38:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:38:47.657703 | orchestrator | 2025-09-23 21:38:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:38:47.657731 | orchestrator | 2025-09-23 21:38:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:38:50.705535 | orchestrator | 2025-09-23 21:38:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:38:50.707044 | orchestrator | 2025-09-23 21:38:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:38:50.707852 | orchestrator | 2025-09-23 21:38:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:38:53.755540 | orchestrator | 2025-09-23 21:38:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:38:53.756771 | orchestrator | 2025-09-23 21:38:53 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:38:53.756824 | orchestrator | 2025-09-23 21:38:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:38:56.799598 | orchestrator | 2025-09-23 21:38:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:38:56.802674 | orchestrator | 2025-09-23 21:38:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:38:56.802758 | orchestrator | 2025-09-23 21:38:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:38:59.845452 | orchestrator | 2025-09-23 21:38:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:38:59.847506 | orchestrator | 2025-09-23 21:38:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:38:59.847621 | orchestrator | 2025-09-23 21:38:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:02.892272 | orchestrator | 2025-09-23 21:39:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:02.893620 | orchestrator | 2025-09-23 21:39:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:02.893801 | orchestrator | 2025-09-23 21:39:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:05.940446 | orchestrator | 2025-09-23 21:39:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:05.942183 | orchestrator | 2025-09-23 21:39:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:05.942239 | orchestrator | 2025-09-23 21:39:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:08.987525 | orchestrator | 2025-09-23 21:39:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:08.989251 | orchestrator | 2025-09-23 21:39:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:39:08.989366 | orchestrator | 2025-09-23 21:39:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:12.033916 | orchestrator | 2025-09-23 21:39:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:12.035749 | orchestrator | 2025-09-23 21:39:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:12.035787 | orchestrator | 2025-09-23 21:39:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:15.080325 | orchestrator | 2025-09-23 21:39:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:15.082157 | orchestrator | 2025-09-23 21:39:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:15.082209 | orchestrator | 2025-09-23 21:39:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:18.122758 | orchestrator | 2025-09-23 21:39:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:18.124821 | orchestrator | 2025-09-23 21:39:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:18.124850 | orchestrator | 2025-09-23 21:39:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:21.170433 | orchestrator | 2025-09-23 21:39:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:21.171526 | orchestrator | 2025-09-23 21:39:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:21.171573 | orchestrator | 2025-09-23 21:39:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:24.206505 | orchestrator | 2025-09-23 21:39:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:24.207516 | orchestrator | 2025-09-23 21:39:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:24.207636 | orchestrator | 2025-09-23 21:39:24 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:39:27.249854 | orchestrator | 2025-09-23 21:39:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:27.251115 | orchestrator | 2025-09-23 21:39:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:27.251167 | orchestrator | 2025-09-23 21:39:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:30.294889 | orchestrator | 2025-09-23 21:39:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:30.296256 | orchestrator | 2025-09-23 21:39:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:30.296411 | orchestrator | 2025-09-23 21:39:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:33.341596 | orchestrator | 2025-09-23 21:39:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:33.343454 | orchestrator | 2025-09-23 21:39:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:33.343499 | orchestrator | 2025-09-23 21:39:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:36.381087 | orchestrator | 2025-09-23 21:39:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:36.381743 | orchestrator | 2025-09-23 21:39:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:36.381776 | orchestrator | 2025-09-23 21:39:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:39.429759 | orchestrator | 2025-09-23 21:39:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:39.431331 | orchestrator | 2025-09-23 21:39:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:39.431793 | orchestrator | 2025-09-23 21:39:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:42.472327 | orchestrator | 
2025-09-23 21:39:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:42.474413 | orchestrator | 2025-09-23 21:39:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:42.474475 | orchestrator | 2025-09-23 21:39:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:45.520406 | orchestrator | 2025-09-23 21:39:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:45.522244 | orchestrator | 2025-09-23 21:39:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:45.522329 | orchestrator | 2025-09-23 21:39:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:48.564623 | orchestrator | 2025-09-23 21:39:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:48.567497 | orchestrator | 2025-09-23 21:39:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:48.567698 | orchestrator | 2025-09-23 21:39:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:51.614227 | orchestrator | 2025-09-23 21:39:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:51.615729 | orchestrator | 2025-09-23 21:39:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:51.615766 | orchestrator | 2025-09-23 21:39:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:54.660078 | orchestrator | 2025-09-23 21:39:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:39:54.661130 | orchestrator | 2025-09-23 21:39:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:54.661165 | orchestrator | 2025-09-23 21:39:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:39:57.703919 | orchestrator | 2025-09-23 21:39:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:39:57.705276 | orchestrator | 2025-09-23 21:39:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:39:57.705361 | orchestrator | 2025-09-23 21:39:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:00.750580 | orchestrator | 2025-09-23 21:40:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:00.752017 | orchestrator | 2025-09-23 21:40:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:00.752689 | orchestrator | 2025-09-23 21:40:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:03.796854 | orchestrator | 2025-09-23 21:40:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:03.798442 | orchestrator | 2025-09-23 21:40:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:03.798627 | orchestrator | 2025-09-23 21:40:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:06.839106 | orchestrator | 2025-09-23 21:40:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:06.840309 | orchestrator | 2025-09-23 21:40:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:06.840385 | orchestrator | 2025-09-23 21:40:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:09.880665 | orchestrator | 2025-09-23 21:40:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:09.882370 | orchestrator | 2025-09-23 21:40:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:09.882464 | orchestrator | 2025-09-23 21:40:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:12.924078 | orchestrator | 2025-09-23 21:40:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:12.926838 | orchestrator | 2025-09-23 21:40:12 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:12.926875 | orchestrator | 2025-09-23 21:40:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:15.977423 | orchestrator | 2025-09-23 21:40:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:15.979689 | orchestrator | 2025-09-23 21:40:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:15.980305 | orchestrator | 2025-09-23 21:40:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:19.023728 | orchestrator | 2025-09-23 21:40:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:19.024551 | orchestrator | 2025-09-23 21:40:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:19.024677 | orchestrator | 2025-09-23 21:40:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:22.067926 | orchestrator | 2025-09-23 21:40:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:22.069257 | orchestrator | 2025-09-23 21:40:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:22.069289 | orchestrator | 2025-09-23 21:40:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:25.112787 | orchestrator | 2025-09-23 21:40:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:25.113435 | orchestrator | 2025-09-23 21:40:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:25.113633 | orchestrator | 2025-09-23 21:40:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:28.158856 | orchestrator | 2025-09-23 21:40:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:28.160211 | orchestrator | 2025-09-23 21:40:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:40:28.160271 | orchestrator | 2025-09-23 21:40:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:31.206872 | orchestrator | 2025-09-23 21:40:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:31.208489 | orchestrator | 2025-09-23 21:40:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:31.208691 | orchestrator | 2025-09-23 21:40:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:34.255369 | orchestrator | 2025-09-23 21:40:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:34.256224 | orchestrator | 2025-09-23 21:40:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:34.256260 | orchestrator | 2025-09-23 21:40:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:37.296095 | orchestrator | 2025-09-23 21:40:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:37.299336 | orchestrator | 2025-09-23 21:40:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:37.299369 | orchestrator | 2025-09-23 21:40:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:40.345208 | orchestrator | 2025-09-23 21:40:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:40.346675 | orchestrator | 2025-09-23 21:40:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:40.346967 | orchestrator | 2025-09-23 21:40:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:43.391092 | orchestrator | 2025-09-23 21:40:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:43.394288 | orchestrator | 2025-09-23 21:40:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:43.394341 | orchestrator | 2025-09-23 21:40:43 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:40:46.439411 | orchestrator | 2025-09-23 21:40:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:46.440403 | orchestrator | 2025-09-23 21:40:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:46.440532 | orchestrator | 2025-09-23 21:40:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:49.486216 | orchestrator | 2025-09-23 21:40:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:49.487359 | orchestrator | 2025-09-23 21:40:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:49.487446 | orchestrator | 2025-09-23 21:40:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:52.537096 | orchestrator | 2025-09-23 21:40:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:52.539398 | orchestrator | 2025-09-23 21:40:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:52.539434 | orchestrator | 2025-09-23 21:40:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:55.582859 | orchestrator | 2025-09-23 21:40:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:55.585266 | orchestrator | 2025-09-23 21:40:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:55.585305 | orchestrator | 2025-09-23 21:40:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:40:58.633584 | orchestrator | 2025-09-23 21:40:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:40:58.634999 | orchestrator | 2025-09-23 21:40:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:40:58.635095 | orchestrator | 2025-09-23 21:40:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:01.682548 | orchestrator | 
2025-09-23 21:41:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:01.684218 | orchestrator | 2025-09-23 21:41:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:01.684268 | orchestrator | 2025-09-23 21:41:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:04.724955 | orchestrator | 2025-09-23 21:41:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:04.726592 | orchestrator | 2025-09-23 21:41:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:04.726610 | orchestrator | 2025-09-23 21:41:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:07.770664 | orchestrator | 2025-09-23 21:41:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:07.771739 | orchestrator | 2025-09-23 21:41:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:07.771835 | orchestrator | 2025-09-23 21:41:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:10.816841 | orchestrator | 2025-09-23 21:41:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:10.818827 | orchestrator | 2025-09-23 21:41:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:10.818914 | orchestrator | 2025-09-23 21:41:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:13.865782 | orchestrator | 2025-09-23 21:41:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:13.868376 | orchestrator | 2025-09-23 21:41:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:13.868436 | orchestrator | 2025-09-23 21:41:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:16.914444 | orchestrator | 2025-09-23 21:41:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:41:16.916642 | orchestrator | 2025-09-23 21:41:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:16.916760 | orchestrator | 2025-09-23 21:41:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:19.958751 | orchestrator | 2025-09-23 21:41:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:19.960441 | orchestrator | 2025-09-23 21:41:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:19.960530 | orchestrator | 2025-09-23 21:41:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:23.002944 | orchestrator | 2025-09-23 21:41:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:23.004991 | orchestrator | 2025-09-23 21:41:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:23.005120 | orchestrator | 2025-09-23 21:41:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:26.043996 | orchestrator | 2025-09-23 21:41:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:26.045629 | orchestrator | 2025-09-23 21:41:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:26.045643 | orchestrator | 2025-09-23 21:41:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:29.096062 | orchestrator | 2025-09-23 21:41:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:29.098336 | orchestrator | 2025-09-23 21:41:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:29.098671 | orchestrator | 2025-09-23 21:41:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:32.147979 | orchestrator | 2025-09-23 21:41:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:32.149356 | orchestrator | 2025-09-23 21:41:32 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:32.149388 | orchestrator | 2025-09-23 21:41:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:35.196390 | orchestrator | 2025-09-23 21:41:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:35.198771 | orchestrator | 2025-09-23 21:41:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:35.198861 | orchestrator | 2025-09-23 21:41:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:38.242500 | orchestrator | 2025-09-23 21:41:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:38.244154 | orchestrator | 2025-09-23 21:41:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:38.244188 | orchestrator | 2025-09-23 21:41:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:41.288715 | orchestrator | 2025-09-23 21:41:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:41.289664 | orchestrator | 2025-09-23 21:41:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:41.289697 | orchestrator | 2025-09-23 21:41:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:44.334422 | orchestrator | 2025-09-23 21:41:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:44.335765 | orchestrator | 2025-09-23 21:41:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:44.336268 | orchestrator | 2025-09-23 21:41:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:47.385774 | orchestrator | 2025-09-23 21:41:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:47.387205 | orchestrator | 2025-09-23 21:41:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:41:47.387245 | orchestrator | 2025-09-23 21:41:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:50.434273 | orchestrator | 2025-09-23 21:41:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:50.435757 | orchestrator | 2025-09-23 21:41:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:50.435789 | orchestrator | 2025-09-23 21:41:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:53.487362 | orchestrator | 2025-09-23 21:41:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:53.489358 | orchestrator | 2025-09-23 21:41:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:53.489677 | orchestrator | 2025-09-23 21:41:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:56.531424 | orchestrator | 2025-09-23 21:41:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:56.533402 | orchestrator | 2025-09-23 21:41:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:56.533446 | orchestrator | 2025-09-23 21:41:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:41:59.581569 | orchestrator | 2025-09-23 21:41:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:41:59.583445 | orchestrator | 2025-09-23 21:41:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:41:59.583775 | orchestrator | 2025-09-23 21:41:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:42:02.630353 | orchestrator | 2025-09-23 21:42:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:42:02.632064 | orchestrator | 2025-09-23 21:42:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:42:02.632096 | orchestrator | 2025-09-23 21:42:02 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:42:05.674753 | orchestrator | 2025-09-23 21:42:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:42:05.675988 | orchestrator | 2025-09-23 21:42:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:42:05.676075 | orchestrator | 2025-09-23 21:42:05 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 21:42:08 to 21:47:16; both tasks remained in state STARTED throughout ...]
2025-09-23 21:47:19.513344 | orchestrator | 2025-09-23 21:47:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:19.515754 | orchestrator | 2025-09-23 21:47:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:19.515987 | orchestrator | 2025-09-23 21:47:19 | INFO  | Wait
1 second(s) until the next check 2025-09-23 21:47:22.561879 | orchestrator | 2025-09-23 21:47:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:22.564514 | orchestrator | 2025-09-23 21:47:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:22.564800 | orchestrator | 2025-09-23 21:47:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:25.609246 | orchestrator | 2025-09-23 21:47:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:25.610719 | orchestrator | 2025-09-23 21:47:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:25.610794 | orchestrator | 2025-09-23 21:47:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:28.658130 | orchestrator | 2025-09-23 21:47:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:28.659678 | orchestrator | 2025-09-23 21:47:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:28.659721 | orchestrator | 2025-09-23 21:47:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:31.704063 | orchestrator | 2025-09-23 21:47:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:31.705406 | orchestrator | 2025-09-23 21:47:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:31.705435 | orchestrator | 2025-09-23 21:47:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:34.746992 | orchestrator | 2025-09-23 21:47:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:34.749751 | orchestrator | 2025-09-23 21:47:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:34.749801 | orchestrator | 2025-09-23 21:47:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:37.798135 | orchestrator | 
2025-09-23 21:47:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:37.799942 | orchestrator | 2025-09-23 21:47:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:37.799974 | orchestrator | 2025-09-23 21:47:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:40.844148 | orchestrator | 2025-09-23 21:47:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:40.846335 | orchestrator | 2025-09-23 21:47:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:40.846383 | orchestrator | 2025-09-23 21:47:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:43.892229 | orchestrator | 2025-09-23 21:47:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:43.894270 | orchestrator | 2025-09-23 21:47:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:43.894303 | orchestrator | 2025-09-23 21:47:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:46.936259 | orchestrator | 2025-09-23 21:47:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:46.938730 | orchestrator | 2025-09-23 21:47:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:46.938810 | orchestrator | 2025-09-23 21:47:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:49.983150 | orchestrator | 2025-09-23 21:47:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:49.984786 | orchestrator | 2025-09-23 21:47:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:49.985183 | orchestrator | 2025-09-23 21:47:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:53.033249 | orchestrator | 2025-09-23 21:47:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:47:53.034482 | orchestrator | 2025-09-23 21:47:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:53.034518 | orchestrator | 2025-09-23 21:47:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:56.079026 | orchestrator | 2025-09-23 21:47:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:56.080229 | orchestrator | 2025-09-23 21:47:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:56.080259 | orchestrator | 2025-09-23 21:47:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:47:59.124229 | orchestrator | 2025-09-23 21:47:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:47:59.127299 | orchestrator | 2025-09-23 21:47:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:47:59.127341 | orchestrator | 2025-09-23 21:47:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:02.171098 | orchestrator | 2025-09-23 21:48:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:02.171994 | orchestrator | 2025-09-23 21:48:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:02.172134 | orchestrator | 2025-09-23 21:48:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:05.213092 | orchestrator | 2025-09-23 21:48:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:05.214865 | orchestrator | 2025-09-23 21:48:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:05.214924 | orchestrator | 2025-09-23 21:48:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:08.257532 | orchestrator | 2025-09-23 21:48:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:08.259373 | orchestrator | 2025-09-23 21:48:08 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:08.259428 | orchestrator | 2025-09-23 21:48:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:11.300182 | orchestrator | 2025-09-23 21:48:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:11.302364 | orchestrator | 2025-09-23 21:48:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:11.302433 | orchestrator | 2025-09-23 21:48:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:14.350581 | orchestrator | 2025-09-23 21:48:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:14.352213 | orchestrator | 2025-09-23 21:48:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:14.352241 | orchestrator | 2025-09-23 21:48:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:17.395366 | orchestrator | 2025-09-23 21:48:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:17.398427 | orchestrator | 2025-09-23 21:48:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:17.398619 | orchestrator | 2025-09-23 21:48:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:20.436923 | orchestrator | 2025-09-23 21:48:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:20.438393 | orchestrator | 2025-09-23 21:48:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:20.438555 | orchestrator | 2025-09-23 21:48:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:23.483983 | orchestrator | 2025-09-23 21:48:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:23.485168 | orchestrator | 2025-09-23 21:48:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:48:23.485236 | orchestrator | 2025-09-23 21:48:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:26.528315 | orchestrator | 2025-09-23 21:48:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:26.529784 | orchestrator | 2025-09-23 21:48:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:26.529825 | orchestrator | 2025-09-23 21:48:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:29.580087 | orchestrator | 2025-09-23 21:48:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:29.581301 | orchestrator | 2025-09-23 21:48:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:29.581330 | orchestrator | 2025-09-23 21:48:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:32.624561 | orchestrator | 2025-09-23 21:48:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:32.626541 | orchestrator | 2025-09-23 21:48:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:32.626583 | orchestrator | 2025-09-23 21:48:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:35.670585 | orchestrator | 2025-09-23 21:48:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:35.672294 | orchestrator | 2025-09-23 21:48:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:35.672469 | orchestrator | 2025-09-23 21:48:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:38.713976 | orchestrator | 2025-09-23 21:48:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:38.715536 | orchestrator | 2025-09-23 21:48:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:38.715567 | orchestrator | 2025-09-23 21:48:38 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:48:41.765871 | orchestrator | 2025-09-23 21:48:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:41.766898 | orchestrator | 2025-09-23 21:48:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:41.766933 | orchestrator | 2025-09-23 21:48:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:44.809674 | orchestrator | 2025-09-23 21:48:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:44.810803 | orchestrator | 2025-09-23 21:48:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:44.810973 | orchestrator | 2025-09-23 21:48:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:47.857009 | orchestrator | 2025-09-23 21:48:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:47.859475 | orchestrator | 2025-09-23 21:48:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:47.860026 | orchestrator | 2025-09-23 21:48:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:50.905232 | orchestrator | 2025-09-23 21:48:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:50.906483 | orchestrator | 2025-09-23 21:48:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:50.906526 | orchestrator | 2025-09-23 21:48:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:53.943656 | orchestrator | 2025-09-23 21:48:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:53.945523 | orchestrator | 2025-09-23 21:48:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:53.945612 | orchestrator | 2025-09-23 21:48:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:48:56.993157 | orchestrator | 
2025-09-23 21:48:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:48:56.995232 | orchestrator | 2025-09-23 21:48:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:48:56.995266 | orchestrator | 2025-09-23 21:48:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:00.040858 | orchestrator | 2025-09-23 21:49:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:00.042550 | orchestrator | 2025-09-23 21:49:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:00.042689 | orchestrator | 2025-09-23 21:49:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:03.086650 | orchestrator | 2025-09-23 21:49:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:03.089585 | orchestrator | 2025-09-23 21:49:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:03.090257 | orchestrator | 2025-09-23 21:49:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:06.135194 | orchestrator | 2025-09-23 21:49:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:06.136568 | orchestrator | 2025-09-23 21:49:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:06.136597 | orchestrator | 2025-09-23 21:49:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:09.179111 | orchestrator | 2025-09-23 21:49:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:09.181018 | orchestrator | 2025-09-23 21:49:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:09.181062 | orchestrator | 2025-09-23 21:49:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:12.225658 | orchestrator | 2025-09-23 21:49:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:49:12.227263 | orchestrator | 2025-09-23 21:49:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:12.227348 | orchestrator | 2025-09-23 21:49:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:15.270159 | orchestrator | 2025-09-23 21:49:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:15.270846 | orchestrator | 2025-09-23 21:49:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:15.270910 | orchestrator | 2025-09-23 21:49:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:18.314862 | orchestrator | 2025-09-23 21:49:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:18.315993 | orchestrator | 2025-09-23 21:49:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:18.316334 | orchestrator | 2025-09-23 21:49:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:21.360327 | orchestrator | 2025-09-23 21:49:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:21.361191 | orchestrator | 2025-09-23 21:49:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:21.361220 | orchestrator | 2025-09-23 21:49:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:24.416293 | orchestrator | 2025-09-23 21:49:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:24.417750 | orchestrator | 2025-09-23 21:49:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:24.417801 | orchestrator | 2025-09-23 21:49:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:27.466748 | orchestrator | 2025-09-23 21:49:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:27.468900 | orchestrator | 2025-09-23 21:49:27 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:27.468961 | orchestrator | 2025-09-23 21:49:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:30.513364 | orchestrator | 2025-09-23 21:49:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:30.513472 | orchestrator | 2025-09-23 21:49:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:30.513487 | orchestrator | 2025-09-23 21:49:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:33.555415 | orchestrator | 2025-09-23 21:49:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:33.557581 | orchestrator | 2025-09-23 21:49:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:33.557753 | orchestrator | 2025-09-23 21:49:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:36.604353 | orchestrator | 2025-09-23 21:49:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:36.605268 | orchestrator | 2025-09-23 21:49:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:36.605298 | orchestrator | 2025-09-23 21:49:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:39.648484 | orchestrator | 2025-09-23 21:49:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:39.650295 | orchestrator | 2025-09-23 21:49:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:39.650492 | orchestrator | 2025-09-23 21:49:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:42.697224 | orchestrator | 2025-09-23 21:49:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:42.698208 | orchestrator | 2025-09-23 21:49:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:49:42.698542 | orchestrator | 2025-09-23 21:49:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:45.744284 | orchestrator | 2025-09-23 21:49:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:45.746325 | orchestrator | 2025-09-23 21:49:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:45.746447 | orchestrator | 2025-09-23 21:49:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:48.790169 | orchestrator | 2025-09-23 21:49:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:48.792180 | orchestrator | 2025-09-23 21:49:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:48.792240 | orchestrator | 2025-09-23 21:49:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:51.840258 | orchestrator | 2025-09-23 21:49:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:51.841529 | orchestrator | 2025-09-23 21:49:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:51.841745 | orchestrator | 2025-09-23 21:49:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:54.886463 | orchestrator | 2025-09-23 21:49:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:54.888769 | orchestrator | 2025-09-23 21:49:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:54.888858 | orchestrator | 2025-09-23 21:49:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:49:57.937530 | orchestrator | 2025-09-23 21:49:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:49:57.939240 | orchestrator | 2025-09-23 21:49:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:49:57.939460 | orchestrator | 2025-09-23 21:49:57 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:50:00.982844 | orchestrator | 2025-09-23 21:50:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:00.984593 | orchestrator | 2025-09-23 21:50:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:00.984689 | orchestrator | 2025-09-23 21:50:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:04.031872 | orchestrator | 2025-09-23 21:50:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:04.033327 | orchestrator | 2025-09-23 21:50:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:04.033447 | orchestrator | 2025-09-23 21:50:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:07.069423 | orchestrator | 2025-09-23 21:50:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:07.071388 | orchestrator | 2025-09-23 21:50:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:07.071419 | orchestrator | 2025-09-23 21:50:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:10.112422 | orchestrator | 2025-09-23 21:50:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:10.113783 | orchestrator | 2025-09-23 21:50:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:10.113878 | orchestrator | 2025-09-23 21:50:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:13.152778 | orchestrator | 2025-09-23 21:50:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:13.154917 | orchestrator | 2025-09-23 21:50:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:13.154971 | orchestrator | 2025-09-23 21:50:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:16.197746 | orchestrator | 
2025-09-23 21:50:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:16.198333 | orchestrator | 2025-09-23 21:50:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:16.198445 | orchestrator | 2025-09-23 21:50:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:19.243851 | orchestrator | 2025-09-23 21:50:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:19.245234 | orchestrator | 2025-09-23 21:50:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:19.245269 | orchestrator | 2025-09-23 21:50:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:22.291344 | orchestrator | 2025-09-23 21:50:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:22.292536 | orchestrator | 2025-09-23 21:50:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:22.292709 | orchestrator | 2025-09-23 21:50:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:25.338541 | orchestrator | 2025-09-23 21:50:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:25.339504 | orchestrator | 2025-09-23 21:50:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:25.339529 | orchestrator | 2025-09-23 21:50:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:28.390175 | orchestrator | 2025-09-23 21:50:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:28.392451 | orchestrator | 2025-09-23 21:50:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:28.392488 | orchestrator | 2025-09-23 21:50:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:31.431614 | orchestrator | 2025-09-23 21:50:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:50:31.433620 | orchestrator | 2025-09-23 21:50:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:31.433752 | orchestrator | 2025-09-23 21:50:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:34.480984 | orchestrator | 2025-09-23 21:50:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:34.484552 | orchestrator | 2025-09-23 21:50:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:34.484698 | orchestrator | 2025-09-23 21:50:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:37.530880 | orchestrator | 2025-09-23 21:50:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:37.534344 | orchestrator | 2025-09-23 21:50:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:37.534802 | orchestrator | 2025-09-23 21:50:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:40.577170 | orchestrator | 2025-09-23 21:50:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:40.578852 | orchestrator | 2025-09-23 21:50:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:40.579006 | orchestrator | 2025-09-23 21:50:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:43.623917 | orchestrator | 2025-09-23 21:50:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:43.625318 | orchestrator | 2025-09-23 21:50:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:43.625351 | orchestrator | 2025-09-23 21:50:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:46.669248 | orchestrator | 2025-09-23 21:50:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:46.670249 | orchestrator | 2025-09-23 21:50:46 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:46.670309 | orchestrator | 2025-09-23 21:50:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:49.717165 | orchestrator | 2025-09-23 21:50:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:49.718194 | orchestrator | 2025-09-23 21:50:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:49.718420 | orchestrator | 2025-09-23 21:50:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:52.764493 | orchestrator | 2025-09-23 21:50:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:52.766588 | orchestrator | 2025-09-23 21:50:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:52.766712 | orchestrator | 2025-09-23 21:50:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:55.807606 | orchestrator | 2025-09-23 21:50:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:55.808892 | orchestrator | 2025-09-23 21:50:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:55.808924 | orchestrator | 2025-09-23 21:50:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:50:58.853275 | orchestrator | 2025-09-23 21:50:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:50:58.855210 | orchestrator | 2025-09-23 21:50:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:50:58.855252 | orchestrator | 2025-09-23 21:50:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:51:01.902305 | orchestrator | 2025-09-23 21:51:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:51:01.903822 | orchestrator | 2025-09-23 21:51:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:51:01.903853 | orchestrator | 2025-09-23 21:51:01 | INFO  | Wait 1 second(s) until the next check
2025-09-23 21:51:04.951442 | orchestrator | 2025-09-23 21:51:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 21:51:04.953713 | orchestrator | 2025-09-23 21:51:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 21:51:04.953753 | orchestrator | 2025-09-23 21:51:04 | INFO  | Wait 1 second(s) until the next check
2025-09-23 21:56:33.976636 | orchestrator | 2025-09-23 21:56:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 21:56:33.978425 | orchestrator | 2025-09-23 21:56:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 21:56:33.978462 | orchestrator | 2025-09-23 21:56:33 | INFO  | Wait
1 second(s) until the next check 2025-09-23 21:56:37.029888 | orchestrator | 2025-09-23 21:56:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:56:37.031256 | orchestrator | 2025-09-23 21:56:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:56:37.031288 | orchestrator | 2025-09-23 21:56:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:56:40.073746 | orchestrator | 2025-09-23 21:56:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:56:40.075226 | orchestrator | 2025-09-23 21:56:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:56:40.075261 | orchestrator | 2025-09-23 21:56:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:56:43.121082 | orchestrator | 2025-09-23 21:56:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:56:43.123171 | orchestrator | 2025-09-23 21:56:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:56:43.123275 | orchestrator | 2025-09-23 21:56:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:56:46.170241 | orchestrator | 2025-09-23 21:56:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:56:46.171315 | orchestrator | 2025-09-23 21:56:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:56:46.171395 | orchestrator | 2025-09-23 21:56:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:56:49.214286 | orchestrator | 2025-09-23 21:56:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:56:49.216215 | orchestrator | 2025-09-23 21:56:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:56:49.216246 | orchestrator | 2025-09-23 21:56:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:56:52.260168 | orchestrator | 
2025-09-23 21:56:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:56:52.261905 | orchestrator | 2025-09-23 21:56:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:56:52.261935 | orchestrator | 2025-09-23 21:56:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:56:55.309517 | orchestrator | 2025-09-23 21:56:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:56:55.311855 | orchestrator | 2025-09-23 21:56:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:56:55.311932 | orchestrator | 2025-09-23 21:56:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:56:58.348939 | orchestrator | 2025-09-23 21:56:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:56:58.350616 | orchestrator | 2025-09-23 21:56:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:56:58.350656 | orchestrator | 2025-09-23 21:56:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:01.398231 | orchestrator | 2025-09-23 21:57:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:01.400186 | orchestrator | 2025-09-23 21:57:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:01.400216 | orchestrator | 2025-09-23 21:57:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:04.449131 | orchestrator | 2025-09-23 21:57:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:04.450600 | orchestrator | 2025-09-23 21:57:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:04.450925 | orchestrator | 2025-09-23 21:57:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:07.499026 | orchestrator | 2025-09-23 21:57:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:57:07.501816 | orchestrator | 2025-09-23 21:57:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:07.501853 | orchestrator | 2025-09-23 21:57:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:10.547943 | orchestrator | 2025-09-23 21:57:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:10.550693 | orchestrator | 2025-09-23 21:57:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:10.550748 | orchestrator | 2025-09-23 21:57:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:13.600401 | orchestrator | 2025-09-23 21:57:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:13.603961 | orchestrator | 2025-09-23 21:57:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:13.604053 | orchestrator | 2025-09-23 21:57:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:16.655899 | orchestrator | 2025-09-23 21:57:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:16.658193 | orchestrator | 2025-09-23 21:57:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:16.658329 | orchestrator | 2025-09-23 21:57:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:19.703226 | orchestrator | 2025-09-23 21:57:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:19.706103 | orchestrator | 2025-09-23 21:57:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:19.706130 | orchestrator | 2025-09-23 21:57:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:22.747797 | orchestrator | 2025-09-23 21:57:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:22.749054 | orchestrator | 2025-09-23 21:57:22 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:22.749077 | orchestrator | 2025-09-23 21:57:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:25.789674 | orchestrator | 2025-09-23 21:57:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:25.791336 | orchestrator | 2025-09-23 21:57:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:25.791436 | orchestrator | 2025-09-23 21:57:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:28.833985 | orchestrator | 2025-09-23 21:57:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:28.836306 | orchestrator | 2025-09-23 21:57:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:28.836720 | orchestrator | 2025-09-23 21:57:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:31.888014 | orchestrator | 2025-09-23 21:57:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:31.889511 | orchestrator | 2025-09-23 21:57:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:31.889598 | orchestrator | 2025-09-23 21:57:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:34.934813 | orchestrator | 2025-09-23 21:57:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:34.936545 | orchestrator | 2025-09-23 21:57:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:34.936857 | orchestrator | 2025-09-23 21:57:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:37.980292 | orchestrator | 2025-09-23 21:57:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:37.981611 | orchestrator | 2025-09-23 21:57:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:57:37.981642 | orchestrator | 2025-09-23 21:57:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:41.024442 | orchestrator | 2025-09-23 21:57:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:41.025691 | orchestrator | 2025-09-23 21:57:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:41.025793 | orchestrator | 2025-09-23 21:57:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:44.069126 | orchestrator | 2025-09-23 21:57:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:44.071136 | orchestrator | 2025-09-23 21:57:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:44.071246 | orchestrator | 2025-09-23 21:57:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:47.118179 | orchestrator | 2025-09-23 21:57:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:47.119942 | orchestrator | 2025-09-23 21:57:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:47.119994 | orchestrator | 2025-09-23 21:57:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:50.164728 | orchestrator | 2025-09-23 21:57:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:50.165694 | orchestrator | 2025-09-23 21:57:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:50.165720 | orchestrator | 2025-09-23 21:57:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:53.209786 | orchestrator | 2025-09-23 21:57:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:53.211749 | orchestrator | 2025-09-23 21:57:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:53.211785 | orchestrator | 2025-09-23 21:57:53 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:57:56.253382 | orchestrator | 2025-09-23 21:57:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:56.256667 | orchestrator | 2025-09-23 21:57:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:56.256701 | orchestrator | 2025-09-23 21:57:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:57:59.300343 | orchestrator | 2025-09-23 21:57:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:57:59.302119 | orchestrator | 2025-09-23 21:57:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:57:59.302175 | orchestrator | 2025-09-23 21:57:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:02.359956 | orchestrator | 2025-09-23 21:58:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:02.361197 | orchestrator | 2025-09-23 21:58:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:02.361252 | orchestrator | 2025-09-23 21:58:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:05.404706 | orchestrator | 2025-09-23 21:58:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:05.406892 | orchestrator | 2025-09-23 21:58:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:05.407302 | orchestrator | 2025-09-23 21:58:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:08.452408 | orchestrator | 2025-09-23 21:58:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:08.454539 | orchestrator | 2025-09-23 21:58:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:08.454575 | orchestrator | 2025-09-23 21:58:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:11.497346 | orchestrator | 
2025-09-23 21:58:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:11.498759 | orchestrator | 2025-09-23 21:58:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:11.498792 | orchestrator | 2025-09-23 21:58:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:14.544567 | orchestrator | 2025-09-23 21:58:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:14.546330 | orchestrator | 2025-09-23 21:58:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:14.546415 | orchestrator | 2025-09-23 21:58:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:17.591305 | orchestrator | 2025-09-23 21:58:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:17.592363 | orchestrator | 2025-09-23 21:58:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:17.592620 | orchestrator | 2025-09-23 21:58:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:20.629785 | orchestrator | 2025-09-23 21:58:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:20.631816 | orchestrator | 2025-09-23 21:58:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:20.631911 | orchestrator | 2025-09-23 21:58:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:23.672758 | orchestrator | 2025-09-23 21:58:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:23.673993 | orchestrator | 2025-09-23 21:58:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:23.674099 | orchestrator | 2025-09-23 21:58:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:26.717280 | orchestrator | 2025-09-23 21:58:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:58:26.718728 | orchestrator | 2025-09-23 21:58:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:26.718825 | orchestrator | 2025-09-23 21:58:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:29.763695 | orchestrator | 2025-09-23 21:58:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:29.765206 | orchestrator | 2025-09-23 21:58:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:29.765244 | orchestrator | 2025-09-23 21:58:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:32.815355 | orchestrator | 2025-09-23 21:58:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:32.817481 | orchestrator | 2025-09-23 21:58:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:32.817829 | orchestrator | 2025-09-23 21:58:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:35.854225 | orchestrator | 2025-09-23 21:58:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:35.855778 | orchestrator | 2025-09-23 21:58:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:35.855837 | orchestrator | 2025-09-23 21:58:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:38.901182 | orchestrator | 2025-09-23 21:58:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:38.901386 | orchestrator | 2025-09-23 21:58:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:38.901408 | orchestrator | 2025-09-23 21:58:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:41.952911 | orchestrator | 2025-09-23 21:58:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:41.954517 | orchestrator | 2025-09-23 21:58:41 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:41.954640 | orchestrator | 2025-09-23 21:58:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:45.006949 | orchestrator | 2025-09-23 21:58:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:45.008514 | orchestrator | 2025-09-23 21:58:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:45.008543 | orchestrator | 2025-09-23 21:58:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:48.051166 | orchestrator | 2025-09-23 21:58:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:48.054397 | orchestrator | 2025-09-23 21:58:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:48.054477 | orchestrator | 2025-09-23 21:58:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:51.103532 | orchestrator | 2025-09-23 21:58:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:51.104903 | orchestrator | 2025-09-23 21:58:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:51.104986 | orchestrator | 2025-09-23 21:58:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:54.148329 | orchestrator | 2025-09-23 21:58:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:54.149903 | orchestrator | 2025-09-23 21:58:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:58:54.150135 | orchestrator | 2025-09-23 21:58:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:58:57.194293 | orchestrator | 2025-09-23 21:58:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:58:57.196013 | orchestrator | 2025-09-23 21:58:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 21:58:57.196156 | orchestrator | 2025-09-23 21:58:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:00.243119 | orchestrator | 2025-09-23 21:59:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:00.244896 | orchestrator | 2025-09-23 21:59:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:00.244964 | orchestrator | 2025-09-23 21:59:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:03.286170 | orchestrator | 2025-09-23 21:59:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:03.287990 | orchestrator | 2025-09-23 21:59:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:03.288038 | orchestrator | 2025-09-23 21:59:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:06.337479 | orchestrator | 2025-09-23 21:59:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:06.338475 | orchestrator | 2025-09-23 21:59:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:06.338568 | orchestrator | 2025-09-23 21:59:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:09.381980 | orchestrator | 2025-09-23 21:59:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:09.383453 | orchestrator | 2025-09-23 21:59:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:09.383736 | orchestrator | 2025-09-23 21:59:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:12.429247 | orchestrator | 2025-09-23 21:59:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:12.430547 | orchestrator | 2025-09-23 21:59:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:12.430585 | orchestrator | 2025-09-23 21:59:12 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 21:59:15.477053 | orchestrator | 2025-09-23 21:59:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:15.479736 | orchestrator | 2025-09-23 21:59:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:15.479809 | orchestrator | 2025-09-23 21:59:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:18.525836 | orchestrator | 2025-09-23 21:59:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:18.526707 | orchestrator | 2025-09-23 21:59:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:18.526965 | orchestrator | 2025-09-23 21:59:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:21.571019 | orchestrator | 2025-09-23 21:59:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:21.573157 | orchestrator | 2025-09-23 21:59:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:21.573502 | orchestrator | 2025-09-23 21:59:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:24.617652 | orchestrator | 2025-09-23 21:59:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:24.620384 | orchestrator | 2025-09-23 21:59:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:24.620455 | orchestrator | 2025-09-23 21:59:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:27.669496 | orchestrator | 2025-09-23 21:59:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:27.671461 | orchestrator | 2025-09-23 21:59:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:27.671669 | orchestrator | 2025-09-23 21:59:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:30.718318 | orchestrator | 
2025-09-23 21:59:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:30.721905 | orchestrator | 2025-09-23 21:59:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:30.722093 | orchestrator | 2025-09-23 21:59:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:33.767171 | orchestrator | 2025-09-23 21:59:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:33.768675 | orchestrator | 2025-09-23 21:59:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:33.768778 | orchestrator | 2025-09-23 21:59:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:36.829072 | orchestrator | 2025-09-23 21:59:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:36.829169 | orchestrator | 2025-09-23 21:59:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:36.829184 | orchestrator | 2025-09-23 21:59:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:39.875716 | orchestrator | 2025-09-23 21:59:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:39.877641 | orchestrator | 2025-09-23 21:59:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:39.877717 | orchestrator | 2025-09-23 21:59:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:42.923138 | orchestrator | 2025-09-23 21:59:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:42.925013 | orchestrator | 2025-09-23 21:59:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:42.925036 | orchestrator | 2025-09-23 21:59:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:45.970258 | orchestrator | 2025-09-23 21:59:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 21:59:45.971965 | orchestrator | 2025-09-23 21:59:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:45.972053 | orchestrator | 2025-09-23 21:59:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:49.016010 | orchestrator | 2025-09-23 21:59:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:49.017235 | orchestrator | 2025-09-23 21:59:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:49.017320 | orchestrator | 2025-09-23 21:59:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:52.065477 | orchestrator | 2025-09-23 21:59:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:52.067252 | orchestrator | 2025-09-23 21:59:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:52.067366 | orchestrator | 2025-09-23 21:59:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:55.111930 | orchestrator | 2025-09-23 21:59:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:55.114222 | orchestrator | 2025-09-23 21:59:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:55.114252 | orchestrator | 2025-09-23 21:59:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 21:59:58.161163 | orchestrator | 2025-09-23 21:59:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 21:59:58.162723 | orchestrator | 2025-09-23 21:59:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 21:59:58.162988 | orchestrator | 2025-09-23 21:59:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:00:01.209740 | orchestrator | 2025-09-23 22:00:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:00:01.211390 | orchestrator | 2025-09-23 22:00:01 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:00:01.211517 | orchestrator | 2025-09-23 22:00:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:00:04.255744 | orchestrator | 2025-09-23 22:00:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:00:04.256497 | orchestrator | 2025-09-23 22:00:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:00:04.256531 | orchestrator | 2025-09-23 22:00:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:00:07.297669 | orchestrator | 2025-09-23 22:00:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:00:07.298455 | orchestrator | 2025-09-23 22:00:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:00:07.298486 | orchestrator | 2025-09-23 22:00:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:00:10.334734 | orchestrator | 2025-09-23 22:00:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:00:10.334835 | orchestrator | 2025-09-23 22:00:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:00:10.334847 | orchestrator | 2025-09-23 22:00:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:00:13.374478 | orchestrator | 2025-09-23 22:00:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:00:13.376148 | orchestrator | 2025-09-23 22:00:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:00:13.376475 | orchestrator | 2025-09-23 22:00:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:00:16.428167 | orchestrator | 2025-09-23 22:00:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:00:16.430067 | orchestrator | 2025-09-23 22:00:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:00:16.430555 | orchestrator | 2025-09-23 22:00:16 | INFO  | Wait 1 second(s) until the next check
2025-09-23 22:00:19.479616 | orchestrator | 2025-09-23 22:00:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:00:19.481093 | orchestrator | 2025-09-23 22:00:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:00:19.481164 | orchestrator | 2025-09-23 22:00:19 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 22:00:22 through 22:05:15; tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f remained in state STARTED throughout ...]
2025-09-23 22:05:18.077989 | orchestrator | 2025-09-23 22:05:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:05:18.079490 | orchestrator | 2025-09-23 22:05:18 |
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:18.079707 | orchestrator | 2025-09-23 22:05:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:21.121033 | orchestrator | 2025-09-23 22:05:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:21.121841 | orchestrator | 2025-09-23 22:05:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:21.121873 | orchestrator | 2025-09-23 22:05:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:24.166563 | orchestrator | 2025-09-23 22:05:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:24.168848 | orchestrator | 2025-09-23 22:05:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:24.169197 | orchestrator | 2025-09-23 22:05:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:27.218790 | orchestrator | 2025-09-23 22:05:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:27.220240 | orchestrator | 2025-09-23 22:05:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:27.220527 | orchestrator | 2025-09-23 22:05:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:30.267486 | orchestrator | 2025-09-23 22:05:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:30.267592 | orchestrator | 2025-09-23 22:05:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:30.267695 | orchestrator | 2025-09-23 22:05:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:33.316498 | orchestrator | 2025-09-23 22:05:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:33.317830 | orchestrator | 2025-09-23 22:05:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:05:33.317935 | orchestrator | 2025-09-23 22:05:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:36.362593 | orchestrator | 2025-09-23 22:05:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:36.363773 | orchestrator | 2025-09-23 22:05:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:36.363803 | orchestrator | 2025-09-23 22:05:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:39.407728 | orchestrator | 2025-09-23 22:05:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:39.410202 | orchestrator | 2025-09-23 22:05:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:39.410273 | orchestrator | 2025-09-23 22:05:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:42.447933 | orchestrator | 2025-09-23 22:05:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:42.449059 | orchestrator | 2025-09-23 22:05:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:42.449088 | orchestrator | 2025-09-23 22:05:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:45.493434 | orchestrator | 2025-09-23 22:05:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:45.494334 | orchestrator | 2025-09-23 22:05:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:45.494365 | orchestrator | 2025-09-23 22:05:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:48.549820 | orchestrator | 2025-09-23 22:05:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:48.550446 | orchestrator | 2025-09-23 22:05:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:48.550481 | orchestrator | 2025-09-23 22:05:48 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:05:51.597110 | orchestrator | 2025-09-23 22:05:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:51.598448 | orchestrator | 2025-09-23 22:05:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:51.598493 | orchestrator | 2025-09-23 22:05:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:54.646598 | orchestrator | 2025-09-23 22:05:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:54.647347 | orchestrator | 2025-09-23 22:05:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:54.647376 | orchestrator | 2025-09-23 22:05:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:05:57.692552 | orchestrator | 2025-09-23 22:05:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:05:57.693846 | orchestrator | 2025-09-23 22:05:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:05:57.693874 | orchestrator | 2025-09-23 22:05:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:00.739481 | orchestrator | 2025-09-23 22:06:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:00.741636 | orchestrator | 2025-09-23 22:06:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:00.741977 | orchestrator | 2025-09-23 22:06:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:03.784265 | orchestrator | 2025-09-23 22:06:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:03.785869 | orchestrator | 2025-09-23 22:06:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:03.786160 | orchestrator | 2025-09-23 22:06:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:06.830533 | orchestrator | 
2025-09-23 22:06:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:06.832399 | orchestrator | 2025-09-23 22:06:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:06.832428 | orchestrator | 2025-09-23 22:06:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:09.872716 | orchestrator | 2025-09-23 22:06:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:09.874091 | orchestrator | 2025-09-23 22:06:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:09.874231 | orchestrator | 2025-09-23 22:06:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:12.918225 | orchestrator | 2025-09-23 22:06:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:12.919626 | orchestrator | 2025-09-23 22:06:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:12.919968 | orchestrator | 2025-09-23 22:06:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:15.964923 | orchestrator | 2025-09-23 22:06:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:15.966491 | orchestrator | 2025-09-23 22:06:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:15.966523 | orchestrator | 2025-09-23 22:06:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:19.011036 | orchestrator | 2025-09-23 22:06:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:19.012187 | orchestrator | 2025-09-23 22:06:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:19.012427 | orchestrator | 2025-09-23 22:06:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:22.052983 | orchestrator | 2025-09-23 22:06:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:06:22.054660 | orchestrator | 2025-09-23 22:06:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:22.054742 | orchestrator | 2025-09-23 22:06:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:25.101945 | orchestrator | 2025-09-23 22:06:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:25.105535 | orchestrator | 2025-09-23 22:06:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:25.105569 | orchestrator | 2025-09-23 22:06:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:28.151699 | orchestrator | 2025-09-23 22:06:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:28.152218 | orchestrator | 2025-09-23 22:06:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:28.152318 | orchestrator | 2025-09-23 22:06:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:31.203892 | orchestrator | 2025-09-23 22:06:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:31.205528 | orchestrator | 2025-09-23 22:06:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:31.205600 | orchestrator | 2025-09-23 22:06:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:34.250163 | orchestrator | 2025-09-23 22:06:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:34.251816 | orchestrator | 2025-09-23 22:06:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:34.251846 | orchestrator | 2025-09-23 22:06:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:37.298897 | orchestrator | 2025-09-23 22:06:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:37.301149 | orchestrator | 2025-09-23 22:06:37 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:37.302443 | orchestrator | 2025-09-23 22:06:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:40.349998 | orchestrator | 2025-09-23 22:06:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:40.351104 | orchestrator | 2025-09-23 22:06:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:40.351134 | orchestrator | 2025-09-23 22:06:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:43.395692 | orchestrator | 2025-09-23 22:06:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:43.396956 | orchestrator | 2025-09-23 22:06:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:43.397001 | orchestrator | 2025-09-23 22:06:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:46.441249 | orchestrator | 2025-09-23 22:06:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:46.443546 | orchestrator | 2025-09-23 22:06:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:46.443605 | orchestrator | 2025-09-23 22:06:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:49.492862 | orchestrator | 2025-09-23 22:06:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:49.494337 | orchestrator | 2025-09-23 22:06:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:49.494366 | orchestrator | 2025-09-23 22:06:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:52.537126 | orchestrator | 2025-09-23 22:06:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:52.540777 | orchestrator | 2025-09-23 22:06:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:06:52.541026 | orchestrator | 2025-09-23 22:06:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:55.587548 | orchestrator | 2025-09-23 22:06:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:55.589226 | orchestrator | 2025-09-23 22:06:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:55.589270 | orchestrator | 2025-09-23 22:06:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:06:58.632185 | orchestrator | 2025-09-23 22:06:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:06:58.633253 | orchestrator | 2025-09-23 22:06:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:06:58.633308 | orchestrator | 2025-09-23 22:06:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:01.683936 | orchestrator | 2025-09-23 22:07:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:01.685719 | orchestrator | 2025-09-23 22:07:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:01.685786 | orchestrator | 2025-09-23 22:07:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:04.730502 | orchestrator | 2025-09-23 22:07:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:04.732846 | orchestrator | 2025-09-23 22:07:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:04.732940 | orchestrator | 2025-09-23 22:07:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:07.777734 | orchestrator | 2025-09-23 22:07:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:07.779208 | orchestrator | 2025-09-23 22:07:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:07.779394 | orchestrator | 2025-09-23 22:07:07 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:07:10.821849 | orchestrator | 2025-09-23 22:07:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:10.823774 | orchestrator | 2025-09-23 22:07:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:10.823812 | orchestrator | 2025-09-23 22:07:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:13.867853 | orchestrator | 2025-09-23 22:07:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:13.869574 | orchestrator | 2025-09-23 22:07:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:13.869600 | orchestrator | 2025-09-23 22:07:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:16.912506 | orchestrator | 2025-09-23 22:07:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:16.914558 | orchestrator | 2025-09-23 22:07:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:16.914583 | orchestrator | 2025-09-23 22:07:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:19.955029 | orchestrator | 2025-09-23 22:07:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:19.956095 | orchestrator | 2025-09-23 22:07:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:19.956132 | orchestrator | 2025-09-23 22:07:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:23.003855 | orchestrator | 2025-09-23 22:07:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:23.005532 | orchestrator | 2025-09-23 22:07:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:23.005569 | orchestrator | 2025-09-23 22:07:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:26.050553 | orchestrator | 
2025-09-23 22:07:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:26.051307 | orchestrator | 2025-09-23 22:07:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:26.051344 | orchestrator | 2025-09-23 22:07:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:29.093441 | orchestrator | 2025-09-23 22:07:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:29.094520 | orchestrator | 2025-09-23 22:07:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:29.094555 | orchestrator | 2025-09-23 22:07:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:32.140457 | orchestrator | 2025-09-23 22:07:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:32.142063 | orchestrator | 2025-09-23 22:07:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:32.142099 | orchestrator | 2025-09-23 22:07:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:35.187120 | orchestrator | 2025-09-23 22:07:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:35.187397 | orchestrator | 2025-09-23 22:07:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:35.187472 | orchestrator | 2025-09-23 22:07:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:38.230452 | orchestrator | 2025-09-23 22:07:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:38.231645 | orchestrator | 2025-09-23 22:07:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:38.231676 | orchestrator | 2025-09-23 22:07:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:41.272440 | orchestrator | 2025-09-23 22:07:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:07:41.277065 | orchestrator | 2025-09-23 22:07:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:41.277174 | orchestrator | 2025-09-23 22:07:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:44.311716 | orchestrator | 2025-09-23 22:07:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:44.311906 | orchestrator | 2025-09-23 22:07:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:44.311928 | orchestrator | 2025-09-23 22:07:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:47.354680 | orchestrator | 2025-09-23 22:07:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:47.356303 | orchestrator | 2025-09-23 22:07:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:47.356335 | orchestrator | 2025-09-23 22:07:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:50.398989 | orchestrator | 2025-09-23 22:07:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:50.400037 | orchestrator | 2025-09-23 22:07:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:50.400065 | orchestrator | 2025-09-23 22:07:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:53.446371 | orchestrator | 2025-09-23 22:07:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:53.446623 | orchestrator | 2025-09-23 22:07:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:53.446662 | orchestrator | 2025-09-23 22:07:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:56.497352 | orchestrator | 2025-09-23 22:07:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:56.500419 | orchestrator | 2025-09-23 22:07:56 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:56.500500 | orchestrator | 2025-09-23 22:07:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:07:59.546839 | orchestrator | 2025-09-23 22:07:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:07:59.548339 | orchestrator | 2025-09-23 22:07:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:07:59.548370 | orchestrator | 2025-09-23 22:07:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:02.589116 | orchestrator | 2025-09-23 22:08:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:02.591810 | orchestrator | 2025-09-23 22:08:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:02.591854 | orchestrator | 2025-09-23 22:08:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:05.640447 | orchestrator | 2025-09-23 22:08:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:05.641073 | orchestrator | 2025-09-23 22:08:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:05.641102 | orchestrator | 2025-09-23 22:08:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:08.684835 | orchestrator | 2025-09-23 22:08:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:08.685676 | orchestrator | 2025-09-23 22:08:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:08.685824 | orchestrator | 2025-09-23 22:08:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:11.736933 | orchestrator | 2025-09-23 22:08:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:11.738461 | orchestrator | 2025-09-23 22:08:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:08:11.738930 | orchestrator | 2025-09-23 22:08:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:14.779637 | orchestrator | 2025-09-23 22:08:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:14.781369 | orchestrator | 2025-09-23 22:08:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:14.781414 | orchestrator | 2025-09-23 22:08:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:17.827995 | orchestrator | 2025-09-23 22:08:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:17.828760 | orchestrator | 2025-09-23 22:08:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:17.828836 | orchestrator | 2025-09-23 22:08:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:20.875167 | orchestrator | 2025-09-23 22:08:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:20.876986 | orchestrator | 2025-09-23 22:08:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:20.877312 | orchestrator | 2025-09-23 22:08:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:23.923556 | orchestrator | 2025-09-23 22:08:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:23.923720 | orchestrator | 2025-09-23 22:08:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:23.923739 | orchestrator | 2025-09-23 22:08:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:26.973744 | orchestrator | 2025-09-23 22:08:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:26.976487 | orchestrator | 2025-09-23 22:08:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:26.976876 | orchestrator | 2025-09-23 22:08:26 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:08:30.023010 | orchestrator | 2025-09-23 22:08:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:30.025708 | orchestrator | 2025-09-23 22:08:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:30.025810 | orchestrator | 2025-09-23 22:08:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:33.072179 | orchestrator | 2025-09-23 22:08:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:33.073609 | orchestrator | 2025-09-23 22:08:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:33.073916 | orchestrator | 2025-09-23 22:08:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:36.121410 | orchestrator | 2025-09-23 22:08:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:36.122349 | orchestrator | 2025-09-23 22:08:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:36.122386 | orchestrator | 2025-09-23 22:08:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:39.165792 | orchestrator | 2025-09-23 22:08:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:39.166978 | orchestrator | 2025-09-23 22:08:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:39.167333 | orchestrator | 2025-09-23 22:08:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:42.213377 | orchestrator | 2025-09-23 22:08:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:42.215396 | orchestrator | 2025-09-23 22:08:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:42.215600 | orchestrator | 2025-09-23 22:08:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:45.254777 | orchestrator | 
2025-09-23 22:08:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:45.256637 | orchestrator | 2025-09-23 22:08:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:45.256834 | orchestrator | 2025-09-23 22:08:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:48.297515 | orchestrator | 2025-09-23 22:08:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:48.299920 | orchestrator | 2025-09-23 22:08:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:48.300328 | orchestrator | 2025-09-23 22:08:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:51.340343 | orchestrator | 2025-09-23 22:08:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:51.340964 | orchestrator | 2025-09-23 22:08:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:51.341164 | orchestrator | 2025-09-23 22:08:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:54.385246 | orchestrator | 2025-09-23 22:08:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:54.385657 | orchestrator | 2025-09-23 22:08:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:54.385687 | orchestrator | 2025-09-23 22:08:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:08:57.432224 | orchestrator | 2025-09-23 22:08:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:08:57.434003 | orchestrator | 2025-09-23 22:08:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:08:57.434162 | orchestrator | 2025-09-23 22:08:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:09:00.475169 | orchestrator | 2025-09-23 22:09:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:09:00.475685 | orchestrator | 2025-09-23 22:09:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:09:00.475953 | orchestrator | 2025-09-23 22:09:00 | INFO  | Wait 1 second(s) until the next check
[... repeated polling output elided: tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f remained in state STARTED, checked every ~3 seconds from 22:09:03 through 22:14:29 ...]
2025-09-23 22:14:32.610731 | orchestrator | 2025-09-23 22:14:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:32.611657 | orchestrator | 2025-09-23 22:14:32 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:14:32.611763 | orchestrator | 2025-09-23 22:14:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:14:35.660021 | orchestrator | 2025-09-23 22:14:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:35.660848 | orchestrator | 2025-09-23 22:14:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:14:35.660969 | orchestrator | 2025-09-23 22:14:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:14:38.710609 | orchestrator | 2025-09-23 22:14:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:38.710786 | orchestrator | 2025-09-23 22:14:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:14:38.710806 | orchestrator | 2025-09-23 22:14:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:14:41.752454 | orchestrator | 2025-09-23 22:14:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:41.754478 | orchestrator | 2025-09-23 22:14:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:14:41.754578 | orchestrator | 2025-09-23 22:14:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:14:44.799282 | orchestrator | 2025-09-23 22:14:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:44.800413 | orchestrator | 2025-09-23 22:14:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:14:44.800447 | orchestrator | 2025-09-23 22:14:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:14:47.846828 | orchestrator | 2025-09-23 22:14:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:47.847790 | orchestrator | 2025-09-23 22:14:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:14:47.847852 | orchestrator | 2025-09-23 22:14:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:14:50.898335 | orchestrator | 2025-09-23 22:14:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:50.900143 | orchestrator | 2025-09-23 22:14:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:14:50.900301 | orchestrator | 2025-09-23 22:14:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:14:53.947835 | orchestrator | 2025-09-23 22:14:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:53.948988 | orchestrator | 2025-09-23 22:14:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:14:53.949173 | orchestrator | 2025-09-23 22:14:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:14:56.989775 | orchestrator | 2025-09-23 22:14:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:14:56.991854 | orchestrator | 2025-09-23 22:14:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:14:56.991909 | orchestrator | 2025-09-23 22:14:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:00.037648 | orchestrator | 2025-09-23 22:15:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:00.039982 | orchestrator | 2025-09-23 22:15:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:00.040139 | orchestrator | 2025-09-23 22:15:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:03.082262 | orchestrator | 2025-09-23 22:15:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:03.083103 | orchestrator | 2025-09-23 22:15:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:03.083186 | orchestrator | 2025-09-23 22:15:03 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:15:06.128189 | orchestrator | 2025-09-23 22:15:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:06.129796 | orchestrator | 2025-09-23 22:15:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:06.129843 | orchestrator | 2025-09-23 22:15:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:09.176095 | orchestrator | 2025-09-23 22:15:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:09.176662 | orchestrator | 2025-09-23 22:15:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:09.176826 | orchestrator | 2025-09-23 22:15:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:12.237539 | orchestrator | 2025-09-23 22:15:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:12.240292 | orchestrator | 2025-09-23 22:15:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:12.240337 | orchestrator | 2025-09-23 22:15:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:15.287976 | orchestrator | 2025-09-23 22:15:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:15.289152 | orchestrator | 2025-09-23 22:15:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:15.289183 | orchestrator | 2025-09-23 22:15:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:18.332354 | orchestrator | 2025-09-23 22:15:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:18.335135 | orchestrator | 2025-09-23 22:15:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:18.335177 | orchestrator | 2025-09-23 22:15:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:21.380059 | orchestrator | 
2025-09-23 22:15:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:21.382287 | orchestrator | 2025-09-23 22:15:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:21.382606 | orchestrator | 2025-09-23 22:15:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:24.429110 | orchestrator | 2025-09-23 22:15:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:24.430724 | orchestrator | 2025-09-23 22:15:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:24.430979 | orchestrator | 2025-09-23 22:15:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:27.475179 | orchestrator | 2025-09-23 22:15:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:27.476799 | orchestrator | 2025-09-23 22:15:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:27.477117 | orchestrator | 2025-09-23 22:15:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:30.518502 | orchestrator | 2025-09-23 22:15:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:30.519233 | orchestrator | 2025-09-23 22:15:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:30.519279 | orchestrator | 2025-09-23 22:15:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:33.569541 | orchestrator | 2025-09-23 22:15:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:33.570566 | orchestrator | 2025-09-23 22:15:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:33.570598 | orchestrator | 2025-09-23 22:15:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:36.613925 | orchestrator | 2025-09-23 22:15:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:15:36.616968 | orchestrator | 2025-09-23 22:15:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:36.617176 | orchestrator | 2025-09-23 22:15:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:39.660236 | orchestrator | 2025-09-23 22:15:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:39.662390 | orchestrator | 2025-09-23 22:15:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:39.662477 | orchestrator | 2025-09-23 22:15:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:42.709701 | orchestrator | 2025-09-23 22:15:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:42.711595 | orchestrator | 2025-09-23 22:15:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:42.711643 | orchestrator | 2025-09-23 22:15:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:45.757618 | orchestrator | 2025-09-23 22:15:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:45.758672 | orchestrator | 2025-09-23 22:15:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:45.758792 | orchestrator | 2025-09-23 22:15:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:48.798657 | orchestrator | 2025-09-23 22:15:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:48.798986 | orchestrator | 2025-09-23 22:15:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:48.799066 | orchestrator | 2025-09-23 22:15:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:51.840440 | orchestrator | 2025-09-23 22:15:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:51.842147 | orchestrator | 2025-09-23 22:15:51 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:51.842234 | orchestrator | 2025-09-23 22:15:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:54.883735 | orchestrator | 2025-09-23 22:15:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:54.885228 | orchestrator | 2025-09-23 22:15:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:54.885260 | orchestrator | 2025-09-23 22:15:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:15:57.936048 | orchestrator | 2025-09-23 22:15:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:15:57.937208 | orchestrator | 2025-09-23 22:15:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:15:57.937239 | orchestrator | 2025-09-23 22:15:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:00.987259 | orchestrator | 2025-09-23 22:16:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:00.989049 | orchestrator | 2025-09-23 22:16:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:00.989167 | orchestrator | 2025-09-23 22:16:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:04.038880 | orchestrator | 2025-09-23 22:16:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:04.045532 | orchestrator | 2025-09-23 22:16:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:04.045616 | orchestrator | 2025-09-23 22:16:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:07.093403 | orchestrator | 2025-09-23 22:16:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:07.095034 | orchestrator | 2025-09-23 22:16:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:16:07.095352 | orchestrator | 2025-09-23 22:16:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:10.141038 | orchestrator | 2025-09-23 22:16:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:10.143117 | orchestrator | 2025-09-23 22:16:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:10.143163 | orchestrator | 2025-09-23 22:16:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:13.190872 | orchestrator | 2025-09-23 22:16:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:13.191698 | orchestrator | 2025-09-23 22:16:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:13.191729 | orchestrator | 2025-09-23 22:16:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:16.234873 | orchestrator | 2025-09-23 22:16:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:16.238650 | orchestrator | 2025-09-23 22:16:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:16.238739 | orchestrator | 2025-09-23 22:16:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:19.275695 | orchestrator | 2025-09-23 22:16:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:19.276608 | orchestrator | 2025-09-23 22:16:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:19.276660 | orchestrator | 2025-09-23 22:16:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:22.320941 | orchestrator | 2025-09-23 22:16:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:22.322274 | orchestrator | 2025-09-23 22:16:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:22.322570 | orchestrator | 2025-09-23 22:16:22 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:16:25.365377 | orchestrator | 2025-09-23 22:16:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:25.367359 | orchestrator | 2025-09-23 22:16:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:25.367449 | orchestrator | 2025-09-23 22:16:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:28.413943 | orchestrator | 2025-09-23 22:16:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:28.416278 | orchestrator | 2025-09-23 22:16:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:28.416341 | orchestrator | 2025-09-23 22:16:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:31.462318 | orchestrator | 2025-09-23 22:16:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:31.464070 | orchestrator | 2025-09-23 22:16:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:31.464101 | orchestrator | 2025-09-23 22:16:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:34.509465 | orchestrator | 2025-09-23 22:16:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:34.511154 | orchestrator | 2025-09-23 22:16:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:34.511234 | orchestrator | 2025-09-23 22:16:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:37.554326 | orchestrator | 2025-09-23 22:16:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:37.555613 | orchestrator | 2025-09-23 22:16:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:37.555641 | orchestrator | 2025-09-23 22:16:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:40.592753 | orchestrator | 
2025-09-23 22:16:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:40.593726 | orchestrator | 2025-09-23 22:16:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:40.593937 | orchestrator | 2025-09-23 22:16:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:43.636341 | orchestrator | 2025-09-23 22:16:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:43.638291 | orchestrator | 2025-09-23 22:16:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:43.638331 | orchestrator | 2025-09-23 22:16:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:46.682421 | orchestrator | 2025-09-23 22:16:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:46.685102 | orchestrator | 2025-09-23 22:16:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:46.685132 | orchestrator | 2025-09-23 22:16:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:49.738011 | orchestrator | 2025-09-23 22:16:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:49.739309 | orchestrator | 2025-09-23 22:16:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:49.739346 | orchestrator | 2025-09-23 22:16:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:52.786510 | orchestrator | 2025-09-23 22:16:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:52.787935 | orchestrator | 2025-09-23 22:16:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:52.787973 | orchestrator | 2025-09-23 22:16:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:55.828397 | orchestrator | 2025-09-23 22:16:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:16:55.829382 | orchestrator | 2025-09-23 22:16:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:55.829423 | orchestrator | 2025-09-23 22:16:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:16:58.874591 | orchestrator | 2025-09-23 22:16:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:16:58.875998 | orchestrator | 2025-09-23 22:16:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:16:58.876137 | orchestrator | 2025-09-23 22:16:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:01.922094 | orchestrator | 2025-09-23 22:17:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:01.923903 | orchestrator | 2025-09-23 22:17:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:01.923996 | orchestrator | 2025-09-23 22:17:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:04.970399 | orchestrator | 2025-09-23 22:17:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:04.971852 | orchestrator | 2025-09-23 22:17:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:04.971900 | orchestrator | 2025-09-23 22:17:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:08.016857 | orchestrator | 2025-09-23 22:17:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:08.018954 | orchestrator | 2025-09-23 22:17:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:08.019358 | orchestrator | 2025-09-23 22:17:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:11.060136 | orchestrator | 2025-09-23 22:17:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:11.062119 | orchestrator | 2025-09-23 22:17:11 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:11.062196 | orchestrator | 2025-09-23 22:17:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:14.107366 | orchestrator | 2025-09-23 22:17:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:14.108587 | orchestrator | 2025-09-23 22:17:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:14.108659 | orchestrator | 2025-09-23 22:17:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:17.158242 | orchestrator | 2025-09-23 22:17:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:17.160146 | orchestrator | 2025-09-23 22:17:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:17.160234 | orchestrator | 2025-09-23 22:17:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:20.203447 | orchestrator | 2025-09-23 22:17:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:20.205705 | orchestrator | 2025-09-23 22:17:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:20.205735 | orchestrator | 2025-09-23 22:17:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:23.250316 | orchestrator | 2025-09-23 22:17:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:23.251875 | orchestrator | 2025-09-23 22:17:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:23.252029 | orchestrator | 2025-09-23 22:17:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:26.293367 | orchestrator | 2025-09-23 22:17:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:26.295872 | orchestrator | 2025-09-23 22:17:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:17:26.295946 | orchestrator | 2025-09-23 22:17:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:29.338105 | orchestrator | 2025-09-23 22:17:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:29.339631 | orchestrator | 2025-09-23 22:17:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:29.339963 | orchestrator | 2025-09-23 22:17:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:32.380998 | orchestrator | 2025-09-23 22:17:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:32.381708 | orchestrator | 2025-09-23 22:17:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:32.381751 | orchestrator | 2025-09-23 22:17:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:35.432145 | orchestrator | 2025-09-23 22:17:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:35.434137 | orchestrator | 2025-09-23 22:17:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:35.434252 | orchestrator | 2025-09-23 22:17:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:38.480694 | orchestrator | 2025-09-23 22:17:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:38.481810 | orchestrator | 2025-09-23 22:17:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:38.481893 | orchestrator | 2025-09-23 22:17:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:41.523388 | orchestrator | 2025-09-23 22:17:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:41.525031 | orchestrator | 2025-09-23 22:17:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:41.525120 | orchestrator | 2025-09-23 22:17:41 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:17:44.578743 | orchestrator | 2025-09-23 22:17:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:44.580341 | orchestrator | 2025-09-23 22:17:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:44.580417 | orchestrator | 2025-09-23 22:17:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:47.626688 | orchestrator | 2025-09-23 22:17:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:47.628484 | orchestrator | 2025-09-23 22:17:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:47.628532 | orchestrator | 2025-09-23 22:17:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:50.671944 | orchestrator | 2025-09-23 22:17:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:50.673547 | orchestrator | 2025-09-23 22:17:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:50.673611 | orchestrator | 2025-09-23 22:17:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:53.715659 | orchestrator | 2025-09-23 22:17:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:53.716542 | orchestrator | 2025-09-23 22:17:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:53.716595 | orchestrator | 2025-09-23 22:17:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:56.762255 | orchestrator | 2025-09-23 22:17:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:56.763384 | orchestrator | 2025-09-23 22:17:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:56.763414 | orchestrator | 2025-09-23 22:17:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:17:59.804712 | orchestrator | 
2025-09-23 22:17:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:17:59.806680 | orchestrator | 2025-09-23 22:17:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:17:59.807005 | orchestrator | 2025-09-23 22:17:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:18:02.851021 | orchestrator | 2025-09-23 22:18:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:18:02.852708 | orchestrator | 2025-09-23 22:18:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:18:02.852804 | orchestrator | 2025-09-23 22:18:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:18:05.890095 | orchestrator | 2025-09-23 22:18:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:18:05.891370 | orchestrator | 2025-09-23 22:18:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:18:05.891401 | orchestrator | 2025-09-23 22:18:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:18:08.936959 | orchestrator | 2025-09-23 22:18:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:18:08.938860 | orchestrator | 2025-09-23 22:18:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:18:08.938937 | orchestrator | 2025-09-23 22:18:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:18:11.990715 | orchestrator | 2025-09-23 22:18:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:18:11.993440 | orchestrator | 2025-09-23 22:18:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:18:11.993476 | orchestrator | 2025-09-23 22:18:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:18:15.043974 | orchestrator | 2025-09-23 22:18:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED
2025-09-23 22:18:15.046503 | orchestrator | 2025-09-23 22:18:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:18:15.046800 | orchestrator | 2025-09-23 22:18:15 | INFO  | Wait 1 second(s) until the next check
[... identical polling records elided: tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f remain in state STARTED, checked every ~3 seconds from 2025-09-23 22:18:18 through 2025-09-23 22:23:28 ...]
2025-09-23 22:23:31.984056 | orchestrator | 2025-09-23 22:23:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:23:31.985465 | orchestrator | 2025-09-23 22:23:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:31.985503 | orchestrator | 2025-09-23 22:23:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:35.024502 | orchestrator | 2025-09-23 22:23:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:35.026286 | orchestrator | 2025-09-23 22:23:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:35.026537 | orchestrator | 2025-09-23 22:23:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:38.076533 | orchestrator | 2025-09-23 22:23:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:38.078275 | orchestrator | 2025-09-23 22:23:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:38.078613 | orchestrator | 2025-09-23 22:23:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:41.124202 | orchestrator | 2025-09-23 22:23:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:41.125926 | orchestrator | 2025-09-23 22:23:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:41.125951 | orchestrator | 2025-09-23 22:23:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:44.176560 | orchestrator | 2025-09-23 22:23:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:44.177953 | orchestrator | 2025-09-23 22:23:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:44.177986 | orchestrator | 2025-09-23 22:23:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:47.219638 | orchestrator | 2025-09-23 22:23:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:47.220352 | orchestrator | 2025-09-23 22:23:47 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:47.220381 | orchestrator | 2025-09-23 22:23:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:50.263149 | orchestrator | 2025-09-23 22:23:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:50.264452 | orchestrator | 2025-09-23 22:23:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:50.264489 | orchestrator | 2025-09-23 22:23:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:53.309332 | orchestrator | 2025-09-23 22:23:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:53.311070 | orchestrator | 2025-09-23 22:23:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:53.311103 | orchestrator | 2025-09-23 22:23:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:56.356936 | orchestrator | 2025-09-23 22:23:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:56.359253 | orchestrator | 2025-09-23 22:23:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:56.359287 | orchestrator | 2025-09-23 22:23:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:23:59.406163 | orchestrator | 2025-09-23 22:23:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:23:59.407011 | orchestrator | 2025-09-23 22:23:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:23:59.407042 | orchestrator | 2025-09-23 22:23:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:02.456113 | orchestrator | 2025-09-23 22:24:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:02.457816 | orchestrator | 2025-09-23 22:24:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:24:02.458162 | orchestrator | 2025-09-23 22:24:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:05.503840 | orchestrator | 2025-09-23 22:24:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:05.505910 | orchestrator | 2025-09-23 22:24:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:05.506302 | orchestrator | 2025-09-23 22:24:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:08.550856 | orchestrator | 2025-09-23 22:24:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:08.552369 | orchestrator | 2025-09-23 22:24:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:08.552402 | orchestrator | 2025-09-23 22:24:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:11.595029 | orchestrator | 2025-09-23 22:24:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:11.595975 | orchestrator | 2025-09-23 22:24:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:11.596060 | orchestrator | 2025-09-23 22:24:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:14.640731 | orchestrator | 2025-09-23 22:24:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:14.642509 | orchestrator | 2025-09-23 22:24:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:14.642684 | orchestrator | 2025-09-23 22:24:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:17.685104 | orchestrator | 2025-09-23 22:24:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:17.687297 | orchestrator | 2025-09-23 22:24:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:17.687331 | orchestrator | 2025-09-23 22:24:17 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:24:20.728945 | orchestrator | 2025-09-23 22:24:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:20.730328 | orchestrator | 2025-09-23 22:24:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:20.730365 | orchestrator | 2025-09-23 22:24:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:23.777654 | orchestrator | 2025-09-23 22:24:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:23.778743 | orchestrator | 2025-09-23 22:24:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:23.778890 | orchestrator | 2025-09-23 22:24:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:26.824220 | orchestrator | 2025-09-23 22:24:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:26.825587 | orchestrator | 2025-09-23 22:24:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:26.825725 | orchestrator | 2025-09-23 22:24:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:29.871577 | orchestrator | 2025-09-23 22:24:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:29.872463 | orchestrator | 2025-09-23 22:24:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:29.872508 | orchestrator | 2025-09-23 22:24:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:32.919471 | orchestrator | 2025-09-23 22:24:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:32.920841 | orchestrator | 2025-09-23 22:24:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:32.920872 | orchestrator | 2025-09-23 22:24:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:35.963420 | orchestrator | 
2025-09-23 22:24:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:35.965138 | orchestrator | 2025-09-23 22:24:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:35.965170 | orchestrator | 2025-09-23 22:24:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:39.012851 | orchestrator | 2025-09-23 22:24:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:39.014263 | orchestrator | 2025-09-23 22:24:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:39.014313 | orchestrator | 2025-09-23 22:24:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:42.051268 | orchestrator | 2025-09-23 22:24:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:42.051704 | orchestrator | 2025-09-23 22:24:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:42.051755 | orchestrator | 2025-09-23 22:24:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:45.097867 | orchestrator | 2025-09-23 22:24:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:45.099039 | orchestrator | 2025-09-23 22:24:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:45.099077 | orchestrator | 2025-09-23 22:24:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:48.141316 | orchestrator | 2025-09-23 22:24:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:48.143099 | orchestrator | 2025-09-23 22:24:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:48.143324 | orchestrator | 2025-09-23 22:24:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:51.183270 | orchestrator | 2025-09-23 22:24:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:24:51.185455 | orchestrator | 2025-09-23 22:24:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:51.185729 | orchestrator | 2025-09-23 22:24:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:54.226827 | orchestrator | 2025-09-23 22:24:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:54.229075 | orchestrator | 2025-09-23 22:24:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:54.229157 | orchestrator | 2025-09-23 22:24:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:24:57.275056 | orchestrator | 2025-09-23 22:24:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:24:57.276123 | orchestrator | 2025-09-23 22:24:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:24:57.276171 | orchestrator | 2025-09-23 22:24:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:00.324254 | orchestrator | 2025-09-23 22:25:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:00.328418 | orchestrator | 2025-09-23 22:25:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:00.328658 | orchestrator | 2025-09-23 22:25:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:03.375194 | orchestrator | 2025-09-23 22:25:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:03.376566 | orchestrator | 2025-09-23 22:25:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:03.376688 | orchestrator | 2025-09-23 22:25:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:06.430220 | orchestrator | 2025-09-23 22:25:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:06.432565 | orchestrator | 2025-09-23 22:25:06 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:06.432663 | orchestrator | 2025-09-23 22:25:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:09.481199 | orchestrator | 2025-09-23 22:25:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:09.482983 | orchestrator | 2025-09-23 22:25:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:09.483063 | orchestrator | 2025-09-23 22:25:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:12.525303 | orchestrator | 2025-09-23 22:25:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:12.527676 | orchestrator | 2025-09-23 22:25:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:12.527706 | orchestrator | 2025-09-23 22:25:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:15.575507 | orchestrator | 2025-09-23 22:25:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:15.577388 | orchestrator | 2025-09-23 22:25:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:15.577420 | orchestrator | 2025-09-23 22:25:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:18.624867 | orchestrator | 2025-09-23 22:25:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:18.625840 | orchestrator | 2025-09-23 22:25:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:18.625912 | orchestrator | 2025-09-23 22:25:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:21.666711 | orchestrator | 2025-09-23 22:25:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:21.667983 | orchestrator | 2025-09-23 22:25:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:25:21.668162 | orchestrator | 2025-09-23 22:25:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:24.712962 | orchestrator | 2025-09-23 22:25:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:24.714211 | orchestrator | 2025-09-23 22:25:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:24.714263 | orchestrator | 2025-09-23 22:25:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:27.760121 | orchestrator | 2025-09-23 22:25:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:27.762277 | orchestrator | 2025-09-23 22:25:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:27.762313 | orchestrator | 2025-09-23 22:25:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:30.806297 | orchestrator | 2025-09-23 22:25:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:30.808090 | orchestrator | 2025-09-23 22:25:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:30.808119 | orchestrator | 2025-09-23 22:25:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:33.850411 | orchestrator | 2025-09-23 22:25:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:33.852875 | orchestrator | 2025-09-23 22:25:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:33.853050 | orchestrator | 2025-09-23 22:25:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:36.897105 | orchestrator | 2025-09-23 22:25:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:36.898127 | orchestrator | 2025-09-23 22:25:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:36.898159 | orchestrator | 2025-09-23 22:25:36 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:25:39.944809 | orchestrator | 2025-09-23 22:25:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:39.945616 | orchestrator | 2025-09-23 22:25:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:39.945694 | orchestrator | 2025-09-23 22:25:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:42.987233 | orchestrator | 2025-09-23 22:25:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:42.989024 | orchestrator | 2025-09-23 22:25:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:42.989106 | orchestrator | 2025-09-23 22:25:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:46.028889 | orchestrator | 2025-09-23 22:25:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:46.030186 | orchestrator | 2025-09-23 22:25:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:46.030251 | orchestrator | 2025-09-23 22:25:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:49.076723 | orchestrator | 2025-09-23 22:25:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:49.079133 | orchestrator | 2025-09-23 22:25:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:49.079172 | orchestrator | 2025-09-23 22:25:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:52.126001 | orchestrator | 2025-09-23 22:25:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:52.127719 | orchestrator | 2025-09-23 22:25:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:52.127755 | orchestrator | 2025-09-23 22:25:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:55.172947 | orchestrator | 
2025-09-23 22:25:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:55.174138 | orchestrator | 2025-09-23 22:25:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:55.174202 | orchestrator | 2025-09-23 22:25:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:25:58.219279 | orchestrator | 2025-09-23 22:25:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:25:58.220521 | orchestrator | 2025-09-23 22:25:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:25:58.220566 | orchestrator | 2025-09-23 22:25:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:01.262658 | orchestrator | 2025-09-23 22:26:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:01.264880 | orchestrator | 2025-09-23 22:26:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:01.264930 | orchestrator | 2025-09-23 22:26:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:04.311671 | orchestrator | 2025-09-23 22:26:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:04.313016 | orchestrator | 2025-09-23 22:26:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:04.313235 | orchestrator | 2025-09-23 22:26:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:07.358478 | orchestrator | 2025-09-23 22:26:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:07.359963 | orchestrator | 2025-09-23 22:26:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:07.359996 | orchestrator | 2025-09-23 22:26:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:10.403221 | orchestrator | 2025-09-23 22:26:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:26:10.405790 | orchestrator | 2025-09-23 22:26:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:10.405821 | orchestrator | 2025-09-23 22:26:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:13.453026 | orchestrator | 2025-09-23 22:26:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:13.454735 | orchestrator | 2025-09-23 22:26:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:13.454771 | orchestrator | 2025-09-23 22:26:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:16.501270 | orchestrator | 2025-09-23 22:26:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:16.504760 | orchestrator | 2025-09-23 22:26:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:16.504871 | orchestrator | 2025-09-23 22:26:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:19.550695 | orchestrator | 2025-09-23 22:26:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:19.552978 | orchestrator | 2025-09-23 22:26:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:19.553026 | orchestrator | 2025-09-23 22:26:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:22.599785 | orchestrator | 2025-09-23 22:26:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:22.601267 | orchestrator | 2025-09-23 22:26:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:22.601301 | orchestrator | 2025-09-23 22:26:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:25.646280 | orchestrator | 2025-09-23 22:26:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:25.647899 | orchestrator | 2025-09-23 22:26:25 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:25.647934 | orchestrator | 2025-09-23 22:26:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:28.689262 | orchestrator | 2025-09-23 22:26:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:28.691042 | orchestrator | 2025-09-23 22:26:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:28.691106 | orchestrator | 2025-09-23 22:26:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:31.736134 | orchestrator | 2025-09-23 22:26:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:31.738318 | orchestrator | 2025-09-23 22:26:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:31.738378 | orchestrator | 2025-09-23 22:26:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:34.795644 | orchestrator | 2025-09-23 22:26:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:34.796814 | orchestrator | 2025-09-23 22:26:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:34.797313 | orchestrator | 2025-09-23 22:26:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:37.846218 | orchestrator | 2025-09-23 22:26:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:37.847062 | orchestrator | 2025-09-23 22:26:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:37.847144 | orchestrator | 2025-09-23 22:26:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:40.888052 | orchestrator | 2025-09-23 22:26:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:40.889540 | orchestrator | 2025-09-23 22:26:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:26:40.889575 | orchestrator | 2025-09-23 22:26:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:43.932588 | orchestrator | 2025-09-23 22:26:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:43.934550 | orchestrator | 2025-09-23 22:26:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:43.934648 | orchestrator | 2025-09-23 22:26:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:46.982418 | orchestrator | 2025-09-23 22:26:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:46.983937 | orchestrator | 2025-09-23 22:26:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:46.984109 | orchestrator | 2025-09-23 22:26:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:50.031983 | orchestrator | 2025-09-23 22:26:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:50.033010 | orchestrator | 2025-09-23 22:26:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:50.033081 | orchestrator | 2025-09-23 22:26:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:53.073682 | orchestrator | 2025-09-23 22:26:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:53.075429 | orchestrator | 2025-09-23 22:26:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:53.075468 | orchestrator | 2025-09-23 22:26:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:26:56.114173 | orchestrator | 2025-09-23 22:26:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:56.115129 | orchestrator | 2025-09-23 22:26:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:56.115159 | orchestrator | 2025-09-23 22:26:56 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:26:59.159408 | orchestrator | 2025-09-23 22:26:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:26:59.161339 | orchestrator | 2025-09-23 22:26:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:26:59.161392 | orchestrator | 2025-09-23 22:26:59 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 22:26:59 to 22:32:13; both tasks remain in state STARTED throughout ...]
2025-09-23 22:32:13.007269 | orchestrator | 2025-09-23 22:32:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:13.008460 | orchestrator | 2025-09-23 22:32:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:13.008491 | orchestrator | 2025-09-23 22:32:13 | INFO  | Wait
1 second(s) until the next check 2025-09-23 22:32:16.054466 | orchestrator | 2025-09-23 22:32:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:16.055754 | orchestrator | 2025-09-23 22:32:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:16.055851 | orchestrator | 2025-09-23 22:32:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:19.095564 | orchestrator | 2025-09-23 22:32:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:19.096051 | orchestrator | 2025-09-23 22:32:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:19.096112 | orchestrator | 2025-09-23 22:32:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:22.141374 | orchestrator | 2025-09-23 22:32:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:22.144231 | orchestrator | 2025-09-23 22:32:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:22.144333 | orchestrator | 2025-09-23 22:32:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:25.187149 | orchestrator | 2025-09-23 22:32:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:25.188171 | orchestrator | 2025-09-23 22:32:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:25.188298 | orchestrator | 2025-09-23 22:32:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:28.228687 | orchestrator | 2025-09-23 22:32:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:28.230386 | orchestrator | 2025-09-23 22:32:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:28.230422 | orchestrator | 2025-09-23 22:32:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:31.272369 | orchestrator | 
2025-09-23 22:32:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:31.275054 | orchestrator | 2025-09-23 22:32:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:31.275176 | orchestrator | 2025-09-23 22:32:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:34.321472 | orchestrator | 2025-09-23 22:32:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:34.323200 | orchestrator | 2025-09-23 22:32:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:34.323341 | orchestrator | 2025-09-23 22:32:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:37.366405 | orchestrator | 2025-09-23 22:32:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:37.368142 | orchestrator | 2025-09-23 22:32:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:37.368165 | orchestrator | 2025-09-23 22:32:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:40.418304 | orchestrator | 2025-09-23 22:32:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:40.420410 | orchestrator | 2025-09-23 22:32:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:40.420555 | orchestrator | 2025-09-23 22:32:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:43.466128 | orchestrator | 2025-09-23 22:32:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:43.467851 | orchestrator | 2025-09-23 22:32:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:43.467992 | orchestrator | 2025-09-23 22:32:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:46.511632 | orchestrator | 2025-09-23 22:32:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:32:46.512989 | orchestrator | 2025-09-23 22:32:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:46.513007 | orchestrator | 2025-09-23 22:32:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:49.554832 | orchestrator | 2025-09-23 22:32:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:49.556836 | orchestrator | 2025-09-23 22:32:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:49.556898 | orchestrator | 2025-09-23 22:32:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:52.602219 | orchestrator | 2025-09-23 22:32:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:52.603118 | orchestrator | 2025-09-23 22:32:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:52.603157 | orchestrator | 2025-09-23 22:32:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:55.651668 | orchestrator | 2025-09-23 22:32:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:55.653458 | orchestrator | 2025-09-23 22:32:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:55.653521 | orchestrator | 2025-09-23 22:32:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:32:58.698680 | orchestrator | 2025-09-23 22:32:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:32:58.700978 | orchestrator | 2025-09-23 22:32:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:32:58.701035 | orchestrator | 2025-09-23 22:32:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:01.746212 | orchestrator | 2025-09-23 22:33:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:01.747788 | orchestrator | 2025-09-23 22:33:01 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:01.747973 | orchestrator | 2025-09-23 22:33:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:04.793760 | orchestrator | 2025-09-23 22:33:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:04.795666 | orchestrator | 2025-09-23 22:33:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:04.796050 | orchestrator | 2025-09-23 22:33:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:07.838688 | orchestrator | 2025-09-23 22:33:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:07.841115 | orchestrator | 2025-09-23 22:33:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:07.841202 | orchestrator | 2025-09-23 22:33:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:10.882269 | orchestrator | 2025-09-23 22:33:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:10.883859 | orchestrator | 2025-09-23 22:33:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:10.883892 | orchestrator | 2025-09-23 22:33:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:13.938425 | orchestrator | 2025-09-23 22:33:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:13.939807 | orchestrator | 2025-09-23 22:33:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:13.939837 | orchestrator | 2025-09-23 22:33:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:16.987923 | orchestrator | 2025-09-23 22:33:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:16.989310 | orchestrator | 2025-09-23 22:33:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:33:16.989342 | orchestrator | 2025-09-23 22:33:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:20.041376 | orchestrator | 2025-09-23 22:33:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:20.042673 | orchestrator | 2025-09-23 22:33:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:20.042703 | orchestrator | 2025-09-23 22:33:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:23.088651 | orchestrator | 2025-09-23 22:33:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:23.091059 | orchestrator | 2025-09-23 22:33:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:23.091143 | orchestrator | 2025-09-23 22:33:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:26.133783 | orchestrator | 2025-09-23 22:33:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:26.135336 | orchestrator | 2025-09-23 22:33:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:26.135451 | orchestrator | 2025-09-23 22:33:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:29.183802 | orchestrator | 2025-09-23 22:33:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:29.184848 | orchestrator | 2025-09-23 22:33:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:29.184916 | orchestrator | 2025-09-23 22:33:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:32.230169 | orchestrator | 2025-09-23 22:33:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:32.231222 | orchestrator | 2025-09-23 22:33:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:32.233011 | orchestrator | 2025-09-23 22:33:32 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:33:35.275278 | orchestrator | 2025-09-23 22:33:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:35.277332 | orchestrator | 2025-09-23 22:33:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:35.277385 | orchestrator | 2025-09-23 22:33:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:38.325149 | orchestrator | 2025-09-23 22:33:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:38.326537 | orchestrator | 2025-09-23 22:33:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:38.326593 | orchestrator | 2025-09-23 22:33:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:41.372782 | orchestrator | 2025-09-23 22:33:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:41.374869 | orchestrator | 2025-09-23 22:33:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:41.374910 | orchestrator | 2025-09-23 22:33:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:44.426808 | orchestrator | 2025-09-23 22:33:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:44.428902 | orchestrator | 2025-09-23 22:33:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:44.428938 | orchestrator | 2025-09-23 22:33:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:47.468971 | orchestrator | 2025-09-23 22:33:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:47.470638 | orchestrator | 2025-09-23 22:33:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:47.470688 | orchestrator | 2025-09-23 22:33:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:50.510308 | orchestrator | 
2025-09-23 22:33:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:50.510421 | orchestrator | 2025-09-23 22:33:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:50.510561 | orchestrator | 2025-09-23 22:33:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:53.555782 | orchestrator | 2025-09-23 22:33:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:53.557485 | orchestrator | 2025-09-23 22:33:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:53.557532 | orchestrator | 2025-09-23 22:33:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:56.597591 | orchestrator | 2025-09-23 22:33:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:56.599819 | orchestrator | 2025-09-23 22:33:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:56.600615 | orchestrator | 2025-09-23 22:33:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:33:59.645716 | orchestrator | 2025-09-23 22:33:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:33:59.647195 | orchestrator | 2025-09-23 22:33:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:33:59.647266 | orchestrator | 2025-09-23 22:33:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:02.691529 | orchestrator | 2025-09-23 22:34:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:02.694827 | orchestrator | 2025-09-23 22:34:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:02.694862 | orchestrator | 2025-09-23 22:34:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:05.740468 | orchestrator | 2025-09-23 22:34:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:34:05.741353 | orchestrator | 2025-09-23 22:34:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:05.741639 | orchestrator | 2025-09-23 22:34:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:08.788604 | orchestrator | 2025-09-23 22:34:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:08.790388 | orchestrator | 2025-09-23 22:34:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:08.790438 | orchestrator | 2025-09-23 22:34:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:11.836783 | orchestrator | 2025-09-23 22:34:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:11.838567 | orchestrator | 2025-09-23 22:34:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:11.838665 | orchestrator | 2025-09-23 22:34:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:14.887371 | orchestrator | 2025-09-23 22:34:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:14.888383 | orchestrator | 2025-09-23 22:34:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:14.888469 | orchestrator | 2025-09-23 22:34:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:17.928782 | orchestrator | 2025-09-23 22:34:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:17.931194 | orchestrator | 2025-09-23 22:34:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:17.931362 | orchestrator | 2025-09-23 22:34:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:20.974562 | orchestrator | 2025-09-23 22:34:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:20.975546 | orchestrator | 2025-09-23 22:34:20 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:20.975587 | orchestrator | 2025-09-23 22:34:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:24.018813 | orchestrator | 2025-09-23 22:34:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:24.020211 | orchestrator | 2025-09-23 22:34:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:24.020251 | orchestrator | 2025-09-23 22:34:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:27.065056 | orchestrator | 2025-09-23 22:34:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:27.066213 | orchestrator | 2025-09-23 22:34:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:27.066334 | orchestrator | 2025-09-23 22:34:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:30.113567 | orchestrator | 2025-09-23 22:34:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:30.114749 | orchestrator | 2025-09-23 22:34:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:30.114785 | orchestrator | 2025-09-23 22:34:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:33.154979 | orchestrator | 2025-09-23 22:34:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:33.155317 | orchestrator | 2025-09-23 22:34:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:33.155542 | orchestrator | 2025-09-23 22:34:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:36.199075 | orchestrator | 2025-09-23 22:34:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:36.199727 | orchestrator | 2025-09-23 22:34:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:34:36.199761 | orchestrator | 2025-09-23 22:34:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:39.241028 | orchestrator | 2025-09-23 22:34:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:39.242734 | orchestrator | 2025-09-23 22:34:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:39.242887 | orchestrator | 2025-09-23 22:34:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:42.288477 | orchestrator | 2025-09-23 22:34:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:42.290532 | orchestrator | 2025-09-23 22:34:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:42.290665 | orchestrator | 2025-09-23 22:34:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:45.338367 | orchestrator | 2025-09-23 22:34:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:45.341161 | orchestrator | 2025-09-23 22:34:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:45.341311 | orchestrator | 2025-09-23 22:34:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:48.382799 | orchestrator | 2025-09-23 22:34:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:48.384731 | orchestrator | 2025-09-23 22:34:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:48.384893 | orchestrator | 2025-09-23 22:34:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:51.430579 | orchestrator | 2025-09-23 22:34:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:51.432546 | orchestrator | 2025-09-23 22:34:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:51.432588 | orchestrator | 2025-09-23 22:34:51 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:34:54.478838 | orchestrator | 2025-09-23 22:34:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:54.480477 | orchestrator | 2025-09-23 22:34:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:54.480521 | orchestrator | 2025-09-23 22:34:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:34:57.523850 | orchestrator | 2025-09-23 22:34:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:34:57.525318 | orchestrator | 2025-09-23 22:34:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:34:57.525354 | orchestrator | 2025-09-23 22:34:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:00.575467 | orchestrator | 2025-09-23 22:35:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:00.576797 | orchestrator | 2025-09-23 22:35:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:00.576969 | orchestrator | 2025-09-23 22:35:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:03.621390 | orchestrator | 2025-09-23 22:35:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:03.622584 | orchestrator | 2025-09-23 22:35:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:03.622807 | orchestrator | 2025-09-23 22:35:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:06.669329 | orchestrator | 2025-09-23 22:35:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:06.670301 | orchestrator | 2025-09-23 22:35:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:06.670335 | orchestrator | 2025-09-23 22:35:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:09.710219 | orchestrator | 
2025-09-23 22:35:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:09.711566 | orchestrator | 2025-09-23 22:35:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:09.711587 | orchestrator | 2025-09-23 22:35:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:12.753386 | orchestrator | 2025-09-23 22:35:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:12.754964 | orchestrator | 2025-09-23 22:35:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:12.755009 | orchestrator | 2025-09-23 22:35:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:15.798807 | orchestrator | 2025-09-23 22:35:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:15.801032 | orchestrator | 2025-09-23 22:35:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:15.801065 | orchestrator | 2025-09-23 22:35:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:18.837164 | orchestrator | 2025-09-23 22:35:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:18.839449 | orchestrator | 2025-09-23 22:35:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:18.839908 | orchestrator | 2025-09-23 22:35:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:21.882585 | orchestrator | 2025-09-23 22:35:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:21.884269 | orchestrator | 2025-09-23 22:35:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:21.884359 | orchestrator | 2025-09-23 22:35:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:24.931610 | orchestrator | 2025-09-23 22:35:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:35:24.932726 | orchestrator | 2025-09-23 22:35:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:24.932950 | orchestrator | 2025-09-23 22:35:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:27.977468 | orchestrator | 2025-09-23 22:35:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:27.980276 | orchestrator | 2025-09-23 22:35:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:27.980314 | orchestrator | 2025-09-23 22:35:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:31.024592 | orchestrator | 2025-09-23 22:35:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:31.025552 | orchestrator | 2025-09-23 22:35:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:31.025586 | orchestrator | 2025-09-23 22:35:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:34.072176 | orchestrator | 2025-09-23 22:35:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:34.073549 | orchestrator | 2025-09-23 22:35:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:34.074492 | orchestrator | 2025-09-23 22:35:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:37.116566 | orchestrator | 2025-09-23 22:35:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:37.118198 | orchestrator | 2025-09-23 22:35:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:37.119039 | orchestrator | 2025-09-23 22:35:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:40.158867 | orchestrator | 2025-09-23 22:35:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:40.160187 | orchestrator | 2025-09-23 22:35:40 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:40.160280 | orchestrator | 2025-09-23 22:35:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:43.206878 | orchestrator | 2025-09-23 22:35:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:43.209018 | orchestrator | 2025-09-23 22:35:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:43.209070 | orchestrator | 2025-09-23 22:35:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:46.258897 | orchestrator | 2025-09-23 22:35:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:46.260983 | orchestrator | 2025-09-23 22:35:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:46.261096 | orchestrator | 2025-09-23 22:35:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:49.302001 | orchestrator | 2025-09-23 22:35:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:49.302459 | orchestrator | 2025-09-23 22:35:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:49.302753 | orchestrator | 2025-09-23 22:35:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:52.345449 | orchestrator | 2025-09-23 22:35:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:52.347236 | orchestrator | 2025-09-23 22:35:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:35:52.347271 | orchestrator | 2025-09-23 22:35:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:35:55.390630 | orchestrator | 2025-09-23 22:35:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:35:55.392348 | orchestrator | 2025-09-23 22:35:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:35:55.392422 | orchestrator | 2025-09-23 22:35:55 | INFO  | Wait 1 second(s) until the next check
2025-09-23 22:35:58.435335 | orchestrator | 2025-09-23 22:35:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:35:58.436897 | orchestrator | 2025-09-23 22:35:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:35:58.437528 | orchestrator | 2025-09-23 22:35:58 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated every ~3 seconds from 22:36:01 through 22:41:27; both tasks remained in state STARTED throughout ...]
2025-09-23 22:41:27.568011 | orchestrator | 2025-09-23 22:41:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:41:27.570741 | orchestrator | 2025-09-23 22:41:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:41:27.570956 | orchestrator | 2025-09-23 22:41:27 | INFO  | Wait
1 second(s) until the next check 2025-09-23 22:41:30.619220 | orchestrator | 2025-09-23 22:41:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:30.621089 | orchestrator | 2025-09-23 22:41:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:30.621180 | orchestrator | 2025-09-23 22:41:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:33.665736 | orchestrator | 2025-09-23 22:41:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:33.667442 | orchestrator | 2025-09-23 22:41:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:33.667594 | orchestrator | 2025-09-23 22:41:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:36.710915 | orchestrator | 2025-09-23 22:41:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:36.712031 | orchestrator | 2025-09-23 22:41:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:36.712739 | orchestrator | 2025-09-23 22:41:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:39.757942 | orchestrator | 2025-09-23 22:41:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:39.759385 | orchestrator | 2025-09-23 22:41:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:39.759678 | orchestrator | 2025-09-23 22:41:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:42.803828 | orchestrator | 2025-09-23 22:41:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:42.806269 | orchestrator | 2025-09-23 22:41:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:42.806306 | orchestrator | 2025-09-23 22:41:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:45.854361 | orchestrator | 
2025-09-23 22:41:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:45.855924 | orchestrator | 2025-09-23 22:41:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:45.855955 | orchestrator | 2025-09-23 22:41:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:48.895406 | orchestrator | 2025-09-23 22:41:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:48.895980 | orchestrator | 2025-09-23 22:41:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:48.896013 | orchestrator | 2025-09-23 22:41:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:51.943867 | orchestrator | 2025-09-23 22:41:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:51.945355 | orchestrator | 2025-09-23 22:41:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:51.945393 | orchestrator | 2025-09-23 22:41:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:54.994323 | orchestrator | 2025-09-23 22:41:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:54.995085 | orchestrator | 2025-09-23 22:41:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:54.995119 | orchestrator | 2025-09-23 22:41:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:41:58.048007 | orchestrator | 2025-09-23 22:41:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:41:58.049277 | orchestrator | 2025-09-23 22:41:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:41:58.049312 | orchestrator | 2025-09-23 22:41:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:01.095271 | orchestrator | 2025-09-23 22:42:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:42:01.096364 | orchestrator | 2025-09-23 22:42:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:01.096438 | orchestrator | 2025-09-23 22:42:01 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:04.139635 | orchestrator | 2025-09-23 22:42:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:04.141067 | orchestrator | 2025-09-23 22:42:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:04.141348 | orchestrator | 2025-09-23 22:42:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:07.183018 | orchestrator | 2025-09-23 22:42:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:07.185250 | orchestrator | 2025-09-23 22:42:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:07.185403 | orchestrator | 2025-09-23 22:42:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:10.231645 | orchestrator | 2025-09-23 22:42:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:10.233689 | orchestrator | 2025-09-23 22:42:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:10.234154 | orchestrator | 2025-09-23 22:42:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:13.283141 | orchestrator | 2025-09-23 22:42:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:13.284598 | orchestrator | 2025-09-23 22:42:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:13.284633 | orchestrator | 2025-09-23 22:42:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:16.331316 | orchestrator | 2025-09-23 22:42:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:16.333319 | orchestrator | 2025-09-23 22:42:16 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:16.333362 | orchestrator | 2025-09-23 22:42:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:19.376015 | orchestrator | 2025-09-23 22:42:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:19.377871 | orchestrator | 2025-09-23 22:42:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:19.377921 | orchestrator | 2025-09-23 22:42:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:22.417873 | orchestrator | 2025-09-23 22:42:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:22.418910 | orchestrator | 2025-09-23 22:42:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:22.418961 | orchestrator | 2025-09-23 22:42:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:25.459565 | orchestrator | 2025-09-23 22:42:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:25.461457 | orchestrator | 2025-09-23 22:42:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:25.461502 | orchestrator | 2025-09-23 22:42:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:28.505310 | orchestrator | 2025-09-23 22:42:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:28.507130 | orchestrator | 2025-09-23 22:42:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:28.507265 | orchestrator | 2025-09-23 22:42:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:31.552625 | orchestrator | 2025-09-23 22:42:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:31.554344 | orchestrator | 2025-09-23 22:42:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:42:31.554380 | orchestrator | 2025-09-23 22:42:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:34.602313 | orchestrator | 2025-09-23 22:42:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:34.603925 | orchestrator | 2025-09-23 22:42:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:34.603982 | orchestrator | 2025-09-23 22:42:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:37.651954 | orchestrator | 2025-09-23 22:42:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:37.653436 | orchestrator | 2025-09-23 22:42:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:37.653486 | orchestrator | 2025-09-23 22:42:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:40.697986 | orchestrator | 2025-09-23 22:42:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:40.699051 | orchestrator | 2025-09-23 22:42:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:40.699087 | orchestrator | 2025-09-23 22:42:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:43.744611 | orchestrator | 2025-09-23 22:42:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:43.744842 | orchestrator | 2025-09-23 22:42:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:43.744945 | orchestrator | 2025-09-23 22:42:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:46.791139 | orchestrator | 2025-09-23 22:42:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:46.792816 | orchestrator | 2025-09-23 22:42:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:46.793165 | orchestrator | 2025-09-23 22:42:46 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:42:49.836118 | orchestrator | 2025-09-23 22:42:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:49.837570 | orchestrator | 2025-09-23 22:42:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:49.837861 | orchestrator | 2025-09-23 22:42:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:52.878817 | orchestrator | 2025-09-23 22:42:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:52.881071 | orchestrator | 2025-09-23 22:42:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:52.881113 | orchestrator | 2025-09-23 22:42:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:55.925080 | orchestrator | 2025-09-23 22:42:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:55.926864 | orchestrator | 2025-09-23 22:42:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:55.926934 | orchestrator | 2025-09-23 22:42:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:42:58.979997 | orchestrator | 2025-09-23 22:42:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:42:58.980353 | orchestrator | 2025-09-23 22:42:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:42:58.980837 | orchestrator | 2025-09-23 22:42:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:02.033692 | orchestrator | 2025-09-23 22:43:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:02.034754 | orchestrator | 2025-09-23 22:43:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:02.034833 | orchestrator | 2025-09-23 22:43:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:05.080666 | orchestrator | 
2025-09-23 22:43:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:05.082141 | orchestrator | 2025-09-23 22:43:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:05.082177 | orchestrator | 2025-09-23 22:43:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:08.126616 | orchestrator | 2025-09-23 22:43:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:08.127535 | orchestrator | 2025-09-23 22:43:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:08.127755 | orchestrator | 2025-09-23 22:43:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:11.172535 | orchestrator | 2025-09-23 22:43:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:11.174885 | orchestrator | 2025-09-23 22:43:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:11.175310 | orchestrator | 2025-09-23 22:43:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:14.216725 | orchestrator | 2025-09-23 22:43:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:14.218958 | orchestrator | 2025-09-23 22:43:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:14.219005 | orchestrator | 2025-09-23 22:43:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:17.265884 | orchestrator | 2025-09-23 22:43:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:17.267268 | orchestrator | 2025-09-23 22:43:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:17.267304 | orchestrator | 2025-09-23 22:43:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:20.307844 | orchestrator | 2025-09-23 22:43:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:43:20.309664 | orchestrator | 2025-09-23 22:43:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:20.309699 | orchestrator | 2025-09-23 22:43:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:23.355997 | orchestrator | 2025-09-23 22:43:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:23.357442 | orchestrator | 2025-09-23 22:43:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:23.357876 | orchestrator | 2025-09-23 22:43:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:26.403748 | orchestrator | 2025-09-23 22:43:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:26.406600 | orchestrator | 2025-09-23 22:43:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:26.406665 | orchestrator | 2025-09-23 22:43:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:29.456913 | orchestrator | 2025-09-23 22:43:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:29.457950 | orchestrator | 2025-09-23 22:43:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:29.458003 | orchestrator | 2025-09-23 22:43:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:32.508522 | orchestrator | 2025-09-23 22:43:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:32.509716 | orchestrator | 2025-09-23 22:43:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:32.509745 | orchestrator | 2025-09-23 22:43:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:35.549147 | orchestrator | 2025-09-23 22:43:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:35.550406 | orchestrator | 2025-09-23 22:43:35 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:35.550454 | orchestrator | 2025-09-23 22:43:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:38.595663 | orchestrator | 2025-09-23 22:43:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:38.597168 | orchestrator | 2025-09-23 22:43:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:38.597268 | orchestrator | 2025-09-23 22:43:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:41.637569 | orchestrator | 2025-09-23 22:43:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:41.640270 | orchestrator | 2025-09-23 22:43:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:41.641037 | orchestrator | 2025-09-23 22:43:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:44.682771 | orchestrator | 2025-09-23 22:43:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:44.683944 | orchestrator | 2025-09-23 22:43:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:44.684028 | orchestrator | 2025-09-23 22:43:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:47.722977 | orchestrator | 2025-09-23 22:43:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:47.724637 | orchestrator | 2025-09-23 22:43:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:47.724709 | orchestrator | 2025-09-23 22:43:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:50.774760 | orchestrator | 2025-09-23 22:43:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:50.777283 | orchestrator | 2025-09-23 22:43:50 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:43:50.777334 | orchestrator | 2025-09-23 22:43:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:53.823023 | orchestrator | 2025-09-23 22:43:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:53.825477 | orchestrator | 2025-09-23 22:43:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:53.825535 | orchestrator | 2025-09-23 22:43:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:56.870339 | orchestrator | 2025-09-23 22:43:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:56.871546 | orchestrator | 2025-09-23 22:43:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:56.871718 | orchestrator | 2025-09-23 22:43:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:43:59.914359 | orchestrator | 2025-09-23 22:43:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:43:59.915601 | orchestrator | 2025-09-23 22:43:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:43:59.915640 | orchestrator | 2025-09-23 22:43:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:02.958592 | orchestrator | 2025-09-23 22:44:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:02.960713 | orchestrator | 2025-09-23 22:44:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:02.960788 | orchestrator | 2025-09-23 22:44:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:06.015043 | orchestrator | 2025-09-23 22:44:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:06.015374 | orchestrator | 2025-09-23 22:44:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:06.015514 | orchestrator | 2025-09-23 22:44:06 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:44:09.059323 | orchestrator | 2025-09-23 22:44:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:09.060973 | orchestrator | 2025-09-23 22:44:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:09.061210 | orchestrator | 2025-09-23 22:44:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:12.105156 | orchestrator | 2025-09-23 22:44:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:12.105474 | orchestrator | 2025-09-23 22:44:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:12.105501 | orchestrator | 2025-09-23 22:44:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:15.149588 | orchestrator | 2025-09-23 22:44:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:15.150826 | orchestrator | 2025-09-23 22:44:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:15.150987 | orchestrator | 2025-09-23 22:44:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:18.196121 | orchestrator | 2025-09-23 22:44:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:18.197729 | orchestrator | 2025-09-23 22:44:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:18.197762 | orchestrator | 2025-09-23 22:44:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:21.243956 | orchestrator | 2025-09-23 22:44:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:21.245562 | orchestrator | 2025-09-23 22:44:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:21.245848 | orchestrator | 2025-09-23 22:44:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:24.291199 | orchestrator | 
2025-09-23 22:44:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:24.292865 | orchestrator | 2025-09-23 22:44:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:24.292926 | orchestrator | 2025-09-23 22:44:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:27.337635 | orchestrator | 2025-09-23 22:44:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:27.340002 | orchestrator | 2025-09-23 22:44:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:27.340067 | orchestrator | 2025-09-23 22:44:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:30.379414 | orchestrator | 2025-09-23 22:44:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:30.380369 | orchestrator | 2025-09-23 22:44:30 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:30.380495 | orchestrator | 2025-09-23 22:44:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:33.424616 | orchestrator | 2025-09-23 22:44:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:33.426515 | orchestrator | 2025-09-23 22:44:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:33.426569 | orchestrator | 2025-09-23 22:44:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:36.473634 | orchestrator | 2025-09-23 22:44:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:36.474954 | orchestrator | 2025-09-23 22:44:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:36.475150 | orchestrator | 2025-09-23 22:44:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:39.525876 | orchestrator | 2025-09-23 22:44:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:44:39.527272 | orchestrator | 2025-09-23 22:44:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:39.527334 | orchestrator | 2025-09-23 22:44:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:42.568024 | orchestrator | 2025-09-23 22:44:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:42.568293 | orchestrator | 2025-09-23 22:44:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:42.568733 | orchestrator | 2025-09-23 22:44:42 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:45.612961 | orchestrator | 2025-09-23 22:44:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:45.614934 | orchestrator | 2025-09-23 22:44:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:45.615143 | orchestrator | 2025-09-23 22:44:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:48.661423 | orchestrator | 2025-09-23 22:44:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:48.662549 | orchestrator | 2025-09-23 22:44:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:48.662591 | orchestrator | 2025-09-23 22:44:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:51.703318 | orchestrator | 2025-09-23 22:44:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:51.704417 | orchestrator | 2025-09-23 22:44:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:51.704567 | orchestrator | 2025-09-23 22:44:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:54.751047 | orchestrator | 2025-09-23 22:44:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:54.837600 | orchestrator | 2025-09-23 22:44:54 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:54.837667 | orchestrator | 2025-09-23 22:44:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:44:57.795848 | orchestrator | 2025-09-23 22:44:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:44:57.798310 | orchestrator | 2025-09-23 22:44:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:44:57.798353 | orchestrator | 2025-09-23 22:44:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:45:00.842002 | orchestrator | 2025-09-23 22:45:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:45:00.844192 | orchestrator | 2025-09-23 22:45:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:45:00.844304 | orchestrator | 2025-09-23 22:45:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:45:03.890609 | orchestrator | 2025-09-23 22:45:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:45:03.891875 | orchestrator | 2025-09-23 22:45:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:45:03.891951 | orchestrator | 2025-09-23 22:45:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:45:06.927624 | orchestrator | 2025-09-23 22:45:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:45:06.929200 | orchestrator | 2025-09-23 22:45:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:45:06.929368 | orchestrator | 2025-09-23 22:45:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:45:09.968813 | orchestrator | 2025-09-23 22:45:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:45:09.972529 | orchestrator | 2025-09-23 22:45:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:45:09.972660 | orchestrator | 2025-09-23 22:45:09 | INFO  | Wait 1 second(s) until the next check
2025-09-23 22:45:13.012327 | orchestrator | 2025-09-23 22:45:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:45:13.014399 | orchestrator | 2025-09-23 22:45:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:45:13.014747 | orchestrator | 2025-09-23 22:45:13 | INFO  | Wait 1 second(s) until the next check
[... identical polling output repeated every ~3 seconds: tasks afe0a176-dafc-40e7-8a9f-5d8ab37437fb and 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f remained in state STARTED from 2025-09-23 22:45:16 through 22:50:08 ...]
2025-09-23 22:50:08.539771 | orchestrator | 2025-09-23 22:50:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:50:08.541045 | orchestrator | 2025-09-23 22:50:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:50:08.541164 | orchestrator | 2025-09-23 22:50:08 | INFO  | Wait 1 second(s) until the next check
2025-09-23 22:50:11.581943 | orchestrator | 2025-09-23 22:50:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:50:11.583466 | orchestrator | 2025-09-23 22:50:11 |
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:11.583812 | orchestrator | 2025-09-23 22:50:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:14.630689 | orchestrator | 2025-09-23 22:50:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:14.632709 | orchestrator | 2025-09-23 22:50:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:14.632746 | orchestrator | 2025-09-23 22:50:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:17.674611 | orchestrator | 2025-09-23 22:50:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:17.675650 | orchestrator | 2025-09-23 22:50:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:17.675953 | orchestrator | 2025-09-23 22:50:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:20.715785 | orchestrator | 2025-09-23 22:50:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:20.718360 | orchestrator | 2025-09-23 22:50:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:20.718399 | orchestrator | 2025-09-23 22:50:20 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:23.764615 | orchestrator | 2025-09-23 22:50:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:23.766565 | orchestrator | 2025-09-23 22:50:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:23.766650 | orchestrator | 2025-09-23 22:50:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:26.810384 | orchestrator | 2025-09-23 22:50:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:26.811979 | orchestrator | 2025-09-23 22:50:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:50:26.812010 | orchestrator | 2025-09-23 22:50:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:29.859424 | orchestrator | 2025-09-23 22:50:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:29.860958 | orchestrator | 2025-09-23 22:50:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:29.861047 | orchestrator | 2025-09-23 22:50:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:32.906802 | orchestrator | 2025-09-23 22:50:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:32.908604 | orchestrator | 2025-09-23 22:50:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:32.908687 | orchestrator | 2025-09-23 22:50:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:35.956005 | orchestrator | 2025-09-23 22:50:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:35.958765 | orchestrator | 2025-09-23 22:50:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:35.958877 | orchestrator | 2025-09-23 22:50:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:39.000203 | orchestrator | 2025-09-23 22:50:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:39.001246 | orchestrator | 2025-09-23 22:50:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:39.001327 | orchestrator | 2025-09-23 22:50:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:42.041044 | orchestrator | 2025-09-23 22:50:42 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:42.042527 | orchestrator | 2025-09-23 22:50:42 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:42.042669 | orchestrator | 2025-09-23 22:50:42 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:50:45.089392 | orchestrator | 2025-09-23 22:50:45 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:45.090875 | orchestrator | 2025-09-23 22:50:45 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:45.090916 | orchestrator | 2025-09-23 22:50:45 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:48.140574 | orchestrator | 2025-09-23 22:50:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:48.141595 | orchestrator | 2025-09-23 22:50:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:48.141667 | orchestrator | 2025-09-23 22:50:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:51.190591 | orchestrator | 2025-09-23 22:50:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:51.192334 | orchestrator | 2025-09-23 22:50:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:51.192373 | orchestrator | 2025-09-23 22:50:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:54.235346 | orchestrator | 2025-09-23 22:50:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:54.237469 | orchestrator | 2025-09-23 22:50:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:54.237934 | orchestrator | 2025-09-23 22:50:54 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:50:57.283624 | orchestrator | 2025-09-23 22:50:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:50:57.284779 | orchestrator | 2025-09-23 22:50:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:50:57.284956 | orchestrator | 2025-09-23 22:50:57 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:00.325589 | orchestrator | 
2025-09-23 22:51:00 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:00.327127 | orchestrator | 2025-09-23 22:51:00 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:00.327153 | orchestrator | 2025-09-23 22:51:00 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:03.376056 | orchestrator | 2025-09-23 22:51:03 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:03.377654 | orchestrator | 2025-09-23 22:51:03 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:03.377687 | orchestrator | 2025-09-23 22:51:03 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:06.425647 | orchestrator | 2025-09-23 22:51:06 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:06.426535 | orchestrator | 2025-09-23 22:51:06 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:06.426912 | orchestrator | 2025-09-23 22:51:06 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:09.472250 | orchestrator | 2025-09-23 22:51:09 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:09.473278 | orchestrator | 2025-09-23 22:51:09 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:09.473395 | orchestrator | 2025-09-23 22:51:09 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:12.520195 | orchestrator | 2025-09-23 22:51:12 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:12.521780 | orchestrator | 2025-09-23 22:51:12 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:12.522301 | orchestrator | 2025-09-23 22:51:12 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:15.566392 | orchestrator | 2025-09-23 22:51:15 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:51:15.568013 | orchestrator | 2025-09-23 22:51:15 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:15.568046 | orchestrator | 2025-09-23 22:51:15 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:18.613905 | orchestrator | 2025-09-23 22:51:18 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:18.614237 | orchestrator | 2025-09-23 22:51:18 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:18.614317 | orchestrator | 2025-09-23 22:51:18 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:21.662082 | orchestrator | 2025-09-23 22:51:21 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:21.663609 | orchestrator | 2025-09-23 22:51:21 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:21.663656 | orchestrator | 2025-09-23 22:51:21 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:24.711540 | orchestrator | 2025-09-23 22:51:24 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:24.714389 | orchestrator | 2025-09-23 22:51:24 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:24.714666 | orchestrator | 2025-09-23 22:51:24 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:27.765390 | orchestrator | 2025-09-23 22:51:27 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:27.767522 | orchestrator | 2025-09-23 22:51:27 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:27.767575 | orchestrator | 2025-09-23 22:51:27 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:30.815710 | orchestrator | 2025-09-23 22:51:30 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:30.816974 | orchestrator | 2025-09-23 22:51:30 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:30.817006 | orchestrator | 2025-09-23 22:51:30 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:33.869455 | orchestrator | 2025-09-23 22:51:33 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:33.871121 | orchestrator | 2025-09-23 22:51:33 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:33.871521 | orchestrator | 2025-09-23 22:51:33 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:36.916597 | orchestrator | 2025-09-23 22:51:36 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:36.917920 | orchestrator | 2025-09-23 22:51:36 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:36.917958 | orchestrator | 2025-09-23 22:51:36 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:39.964647 | orchestrator | 2025-09-23 22:51:39 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:39.965647 | orchestrator | 2025-09-23 22:51:39 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:39.965680 | orchestrator | 2025-09-23 22:51:39 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:43.014539 | orchestrator | 2025-09-23 22:51:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:43.016052 | orchestrator | 2025-09-23 22:51:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:43.016080 | orchestrator | 2025-09-23 22:51:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:46.061714 | orchestrator | 2025-09-23 22:51:46 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:46.062544 | orchestrator | 2025-09-23 22:51:46 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:51:46.062585 | orchestrator | 2025-09-23 22:51:46 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:49.109555 | orchestrator | 2025-09-23 22:51:49 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:49.110609 | orchestrator | 2025-09-23 22:51:49 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:49.110752 | orchestrator | 2025-09-23 22:51:49 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:52.155392 | orchestrator | 2025-09-23 22:51:52 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:52.157079 | orchestrator | 2025-09-23 22:51:52 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:52.157127 | orchestrator | 2025-09-23 22:51:52 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:55.198813 | orchestrator | 2025-09-23 22:51:55 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:55.200159 | orchestrator | 2025-09-23 22:51:55 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:55.200192 | orchestrator | 2025-09-23 22:51:55 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:51:58.254398 | orchestrator | 2025-09-23 22:51:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:51:58.256746 | orchestrator | 2025-09-23 22:51:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:51:58.256996 | orchestrator | 2025-09-23 22:51:58 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:01.305905 | orchestrator | 2025-09-23 22:52:01 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:01.307530 | orchestrator | 2025-09-23 22:52:01 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:01.307586 | orchestrator | 2025-09-23 22:52:01 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:52:04.361537 | orchestrator | 2025-09-23 22:52:04 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:04.364891 | orchestrator | 2025-09-23 22:52:04 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:04.365069 | orchestrator | 2025-09-23 22:52:04 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:07.414655 | orchestrator | 2025-09-23 22:52:07 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:07.418582 | orchestrator | 2025-09-23 22:52:07 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:07.418619 | orchestrator | 2025-09-23 22:52:07 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:10.463416 | orchestrator | 2025-09-23 22:52:10 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:10.464720 | orchestrator | 2025-09-23 22:52:10 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:10.464762 | orchestrator | 2025-09-23 22:52:10 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:13.512550 | orchestrator | 2025-09-23 22:52:13 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:13.514325 | orchestrator | 2025-09-23 22:52:13 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:13.514390 | orchestrator | 2025-09-23 22:52:13 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:16.559416 | orchestrator | 2025-09-23 22:52:16 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:16.560456 | orchestrator | 2025-09-23 22:52:16 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:16.560591 | orchestrator | 2025-09-23 22:52:16 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:19.608904 | orchestrator | 
2025-09-23 22:52:19 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:19.610328 | orchestrator | 2025-09-23 22:52:19 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:19.610365 | orchestrator | 2025-09-23 22:52:19 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:22.654328 | orchestrator | 2025-09-23 22:52:22 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:22.655915 | orchestrator | 2025-09-23 22:52:22 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:22.655950 | orchestrator | 2025-09-23 22:52:22 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:25.703482 | orchestrator | 2025-09-23 22:52:25 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:25.704756 | orchestrator | 2025-09-23 22:52:25 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:25.704801 | orchestrator | 2025-09-23 22:52:25 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:28.748521 | orchestrator | 2025-09-23 22:52:28 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:28.750418 | orchestrator | 2025-09-23 22:52:28 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:28.750494 | orchestrator | 2025-09-23 22:52:28 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:31.795188 | orchestrator | 2025-09-23 22:52:31 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:31.796469 | orchestrator | 2025-09-23 22:52:31 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:31.796500 | orchestrator | 2025-09-23 22:52:31 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:34.842957 | orchestrator | 2025-09-23 22:52:34 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED 2025-09-23 22:52:34.843977 | orchestrator | 2025-09-23 22:52:34 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:34.844007 | orchestrator | 2025-09-23 22:52:34 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:37.887217 | orchestrator | 2025-09-23 22:52:37 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:37.889408 | orchestrator | 2025-09-23 22:52:37 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:37.889439 | orchestrator | 2025-09-23 22:52:37 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:40.936286 | orchestrator | 2025-09-23 22:52:40 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:40.938191 | orchestrator | 2025-09-23 22:52:40 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:40.938240 | orchestrator | 2025-09-23 22:52:40 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:43.985503 | orchestrator | 2025-09-23 22:52:43 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:43.987016 | orchestrator | 2025-09-23 22:52:43 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:43.987067 | orchestrator | 2025-09-23 22:52:43 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:47.032481 | orchestrator | 2025-09-23 22:52:47 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:47.034083 | orchestrator | 2025-09-23 22:52:47 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:47.034181 | orchestrator | 2025-09-23 22:52:47 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:50.071678 | orchestrator | 2025-09-23 22:52:50 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:50.073441 | orchestrator | 2025-09-23 22:52:50 | 
INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:50.073536 | orchestrator | 2025-09-23 22:52:50 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:53.122315 | orchestrator | 2025-09-23 22:52:53 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:53.125016 | orchestrator | 2025-09-23 22:52:53 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:53.125047 | orchestrator | 2025-09-23 22:52:53 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:56.174477 | orchestrator | 2025-09-23 22:52:56 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:56.177454 | orchestrator | 2025-09-23 22:52:56 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:56.177487 | orchestrator | 2025-09-23 22:52:56 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:52:59.234518 | orchestrator | 2025-09-23 22:52:59 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:52:59.235290 | orchestrator | 2025-09-23 22:52:59 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:52:59.235326 | orchestrator | 2025-09-23 22:52:59 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:02.280550 | orchestrator | 2025-09-23 22:53:02 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:02.284331 | orchestrator | 2025-09-23 22:53:02 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:02.284432 | orchestrator | 2025-09-23 22:53:02 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:05.330484 | orchestrator | 2025-09-23 22:53:05 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:05.331853 | orchestrator | 2025-09-23 22:53:05 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 
2025-09-23 22:53:05.331934 | orchestrator | 2025-09-23 22:53:05 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:08.386678 | orchestrator | 2025-09-23 22:53:08 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:08.389244 | orchestrator | 2025-09-23 22:53:08 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:08.390952 | orchestrator | 2025-09-23 22:53:08 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:11.432038 | orchestrator | 2025-09-23 22:53:11 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:11.433366 | orchestrator | 2025-09-23 22:53:11 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:11.433688 | orchestrator | 2025-09-23 22:53:11 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:14.479381 | orchestrator | 2025-09-23 22:53:14 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:14.481252 | orchestrator | 2025-09-23 22:53:14 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:14.481284 | orchestrator | 2025-09-23 22:53:14 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:17.530768 | orchestrator | 2025-09-23 22:53:17 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:17.531439 | orchestrator | 2025-09-23 22:53:17 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:17.531531 | orchestrator | 2025-09-23 22:53:17 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:20.574958 | orchestrator | 2025-09-23 22:53:20 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:20.577130 | orchestrator | 2025-09-23 22:53:20 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:20.577266 | orchestrator | 2025-09-23 22:53:20 | INFO  | Wait 
1 second(s) until the next check 2025-09-23 22:53:23.622700 | orchestrator | 2025-09-23 22:53:23 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:23.624054 | orchestrator | 2025-09-23 22:53:23 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:23.624083 | orchestrator | 2025-09-23 22:53:23 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:26.661972 | orchestrator | 2025-09-23 22:53:26 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:26.664640 | orchestrator | 2025-09-23 22:53:26 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:26.664848 | orchestrator | 2025-09-23 22:53:26 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:29.709441 | orchestrator | 2025-09-23 22:53:29 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:29.710451 | orchestrator | 2025-09-23 22:53:29 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:29.710522 | orchestrator | 2025-09-23 22:53:29 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:32.753925 | orchestrator | 2025-09-23 22:53:32 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:32.755138 | orchestrator | 2025-09-23 22:53:32 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:32.755172 | orchestrator | 2025-09-23 22:53:32 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:35.801455 | orchestrator | 2025-09-23 22:53:35 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:35.803514 | orchestrator | 2025-09-23 22:53:35 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:35.803606 | orchestrator | 2025-09-23 22:53:35 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:38.854490 | orchestrator | 
2025-09-23 22:53:38 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:38.855134 | orchestrator | 2025-09-23 22:53:38 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:38.855290 | orchestrator | 2025-09-23 22:53:38 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:41.901719 | orchestrator | 2025-09-23 22:53:41 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:41.903760 | orchestrator | 2025-09-23 22:53:41 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:41.903797 | orchestrator | 2025-09-23 22:53:41 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:44.952035 | orchestrator | 2025-09-23 22:53:44 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:44.953651 | orchestrator | 2025-09-23 22:53:44 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:44.953705 | orchestrator | 2025-09-23 22:53:44 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:48.017302 | orchestrator | 2025-09-23 22:53:48 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:48.017803 | orchestrator | 2025-09-23 22:53:48 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:48.017898 | orchestrator | 2025-09-23 22:53:48 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:51.068851 | orchestrator | 2025-09-23 22:53:51 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED 2025-09-23 22:53:51.070530 | orchestrator | 2025-09-23 22:53:51 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED 2025-09-23 22:53:51.070577 | orchestrator | 2025-09-23 22:53:51 | INFO  | Wait 1 second(s) until the next check 2025-09-23 22:53:54.123218 | orchestrator | 2025-09-23 22:53:54 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in 
state STARTED
2025-09-23 22:53:54.126352 | orchestrator | 2025-09-23 22:53:54 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:53:54.126400 | orchestrator | 2025-09-23 22:53:54 | INFO  | Wait 1 second(s) until the next check
2025-09-23 22:53:57.175750 | orchestrator | 2025-09-23 22:53:57 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:53:57.178602 | orchestrator | 2025-09-23 22:53:57 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:53:57.179091 | orchestrator | 2025-09-23 22:53:57 | INFO  | Wait 1 second(s) until the next check
[… the same pair of status checks repeats roughly every 3 seconds from 22:53:57 through 22:58:58; both tasks remain in state STARTED for the entire interval …]
2025-09-23 22:58:58.909370 | orchestrator | 2025-09-23 22:58:58 | INFO  | Task afe0a176-dafc-40e7-8a9f-5d8ab37437fb is in state STARTED
2025-09-23 22:58:58.910659 | orchestrator | 2025-09-23 22:58:58 | INFO  | Task 79bd4c39-3d19-4f3f-ae60-c469f28cfc4f is in state STARTED
2025-09-23 22:58:58.910974 | orchestrator | 2025-09-23 22:58:58 | INFO  | Wait 1 second(s) until the next check
2025-09-23 22:59:01.564067 | RUN END RESULT_TIMED_OUT: [untrusted : github.com/osism/testbed/playbooks/deploy.yml@main]
2025-09-23 22:59:01.567430 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2025-09-23 22:59:02.231446 |
2025-09-23 22:59:02.231595 | PLAY [Post output play]
2025-09-23 22:59:02.256307 |
2025-09-23 22:59:02.256509 | LOOP [stage-output : Register sources]
2025-09-23 22:59:02.319932 |
2025-09-23 22:59:02.320110 | TASK [stage-output : Check sudo]
2025-09-23 22:59:03.160338 | orchestrator | sudo: a password is required
2025-09-23 22:59:03.353921 | orchestrator | ok: Runtime: 0:00:00.009157
2025-09-23 22:59:03.368671 |
2025-09-23 22:59:03.368823 | LOOP [stage-output : Set source and destination for files and folders]
2025-09-23 22:59:03.408218 |
2025-09-23 22:59:03.408508 | TASK [stage-output : Build a list of source, dest dictionaries]
2025-09-23 22:59:03.475470 | orchestrator | ok
2025-09-23 22:59:03.483979 |
2025-09-23 22:59:03.484087 | LOOP [stage-output : Ensure target folders exist]
2025-09-23 22:59:03.898058 | orchestrator | ok: "docs"
2025-09-23 22:59:03.898531 |
2025-09-23 22:59:04.142969 | orchestrator | ok: "artifacts"
2025-09-23 22:59:04.373462 | orchestrator | ok: "logs"
2025-09-23 22:59:04.399809
| 2025-09-23 22:59:04.399964 | LOOP [stage-output : Copy files and folders to staging folder] 2025-09-23 22:59:04.434206 | 2025-09-23 22:59:04.434432 | TASK [stage-output : Make all log files readable] 2025-09-23 22:59:04.705964 | orchestrator | ok 2025-09-23 22:59:04.715445 | 2025-09-23 22:59:04.715591 | TASK [stage-output : Rename log files that match extensions_to_txt] 2025-09-23 22:59:04.750559 | orchestrator | skipping: Conditional result was False 2025-09-23 22:59:04.762901 | 2025-09-23 22:59:04.763025 | TASK [stage-output : Discover log files for compression] 2025-09-23 22:59:04.787090 | orchestrator | skipping: Conditional result was False 2025-09-23 22:59:04.795428 | 2025-09-23 22:59:04.795613 | LOOP [stage-output : Archive everything from logs] 2025-09-23 22:59:04.835634 | 2025-09-23 22:59:04.835793 | PLAY [Post cleanup play] 2025-09-23 22:59:04.845181 | 2025-09-23 22:59:04.845267 | TASK [Set cloud fact (Zuul deployment)] 2025-09-23 22:59:04.900040 | orchestrator | ok 2025-09-23 22:59:04.913258 | 2025-09-23 22:59:04.913364 | TASK [Set cloud fact (local deployment)] 2025-09-23 22:59:04.937338 | orchestrator | skipping: Conditional result was False 2025-09-23 22:59:04.948076 | 2025-09-23 22:59:04.948175 | TASK [Clean the cloud environment] 2025-09-23 22:59:05.512949 | orchestrator | 2025-09-23 22:59:05 - clean up servers 2025-09-23 22:59:06.274323 | orchestrator | 2025-09-23 22:59:06 - testbed-manager 2025-09-23 22:59:06.368808 | orchestrator | 2025-09-23 22:59:06 - testbed-node-4 2025-09-23 22:59:06.458515 | orchestrator | 2025-09-23 22:59:06 - testbed-node-0 2025-09-23 22:59:06.548104 | orchestrator | 2025-09-23 22:59:06 - testbed-node-2 2025-09-23 22:59:06.644386 | orchestrator | 2025-09-23 22:59:06 - testbed-node-3 2025-09-23 22:59:06.741881 | orchestrator | 2025-09-23 22:59:06 - testbed-node-5 2025-09-23 22:59:06.841318 | orchestrator | 2025-09-23 22:59:06 - testbed-node-1 2025-09-23 22:59:06.936593 | orchestrator | 2025-09-23 22:59:06 - clean up 
keypairs 2025-09-23 22:59:06.958852 | orchestrator | 2025-09-23 22:59:06 - testbed 2025-09-23 22:59:06.988853 | orchestrator | 2025-09-23 22:59:06 - wait for servers to be gone 2025-09-23 22:59:15.699207 | orchestrator | 2025-09-23 22:59:15 - clean up ports 2025-09-23 22:59:15.875465 | orchestrator | 2025-09-23 22:59:15 - 0a508350-954e-4034-80fb-127600f3af04 2025-09-23 22:59:16.642849 | orchestrator | 2025-09-23 22:59:16 - 0f3b4af0-1afb-4a39-9b22-2c6d6c2c2a07 2025-09-23 22:59:16.880814 | orchestrator | 2025-09-23 22:59:16 - 2a72add7-a386-4673-a114-4869f46d87df 2025-09-23 22:59:17.086603 | orchestrator | 2025-09-23 22:59:17 - 30778571-0825-4754-900a-c66befc54776 2025-09-23 22:59:17.307339 | orchestrator | 2025-09-23 22:59:17 - 4347530c-3b15-41ea-8190-9eeef2f833e1 2025-09-23 22:59:17.723453 | orchestrator | 2025-09-23 22:59:17 - c9051d6a-8cd2-4ea4-aea7-06d311e811bb 2025-09-23 22:59:17.956862 | orchestrator | 2025-09-23 22:59:17 - f513b74f-4aec-4175-a098-baba8ff8fa96 2025-09-23 22:59:18.198694 | orchestrator | 2025-09-23 22:59:18 - clean up volumes 2025-09-23 22:59:18.312095 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-1-node-base 2025-09-23 22:59:18.348747 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-4-node-base 2025-09-23 22:59:18.389258 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-5-node-base 2025-09-23 22:59:18.429648 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-0-node-base 2025-09-23 22:59:18.472593 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-3-node-base 2025-09-23 22:59:18.519449 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-manager-base 2025-09-23 22:59:18.562421 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-2-node-5 2025-09-23 22:59:18.600241 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-8-node-5 2025-09-23 22:59:18.638953 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-1-node-4 2025-09-23 22:59:18.681456 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-5-node-5 2025-09-23 
22:59:18.724000 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-6-node-3 2025-09-23 22:59:18.763532 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-2-node-base 2025-09-23 22:59:18.802689 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-7-node-4 2025-09-23 22:59:18.842745 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-0-node-3 2025-09-23 22:59:18.888241 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-3-node-3 2025-09-23 22:59:18.930520 | orchestrator | 2025-09-23 22:59:18 - testbed-volume-4-node-4 2025-09-23 22:59:18.970896 | orchestrator | 2025-09-23 22:59:18 - disconnect routers 2025-09-23 22:59:19.101157 | orchestrator | 2025-09-23 22:59:19 - testbed 2025-09-23 22:59:20.039931 | orchestrator | 2025-09-23 22:59:20 - clean up subnets 2025-09-23 22:59:20.085469 | orchestrator | 2025-09-23 22:59:20 - subnet-testbed-management 2025-09-23 22:59:20.246529 | orchestrator | 2025-09-23 22:59:20 - clean up networks 2025-09-23 22:59:20.422319 | orchestrator | 2025-09-23 22:59:20 - net-testbed-management 2025-09-23 22:59:20.745144 | orchestrator | 2025-09-23 22:59:20 - clean up security groups 2025-09-23 22:59:20.784891 | orchestrator | 2025-09-23 22:59:20 - testbed-management 2025-09-23 22:59:20.918366 | orchestrator | 2025-09-23 22:59:20 - testbed-node 2025-09-23 22:59:21.041157 | orchestrator | 2025-09-23 22:59:21 - clean up floating ips 2025-09-23 22:59:21.074360 | orchestrator | 2025-09-23 22:59:21 - 81.163.193.123 2025-09-23 22:59:21.430361 | orchestrator | 2025-09-23 22:59:21 - clean up routers 2025-09-23 22:59:21.531664 | orchestrator | 2025-09-23 22:59:21 - testbed 2025-09-23 22:59:22.496547 | orchestrator | ok: Runtime: 0:00:17.192427 2025-09-23 22:59:22.500475 | 2025-09-23 22:59:22.500688 | PLAY RECAP 2025-09-23 22:59:22.500814 | orchestrator | ok: 6 changed: 2 unreachable: 0 failed: 0 skipped: 7 rescued: 0 ignored: 0 2025-09-23 22:59:22.500879 | 2025-09-23 22:59:22.632999 | POST-RUN END RESULT_NORMAL: [untrusted : 
github.com/osism/testbed/playbooks/post.yml@main] 2025-09-23 22:59:22.635337 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main] 2025-09-23 22:59:23.379682 | 2025-09-23 22:59:23.379833 | PLAY [Cleanup play] 2025-09-23 22:59:23.395816 | 2025-09-23 22:59:23.395945 | TASK [Set cloud fact (Zuul deployment)] 2025-09-23 22:59:23.451175 | orchestrator | ok 2025-09-23 22:59:23.460402 | 2025-09-23 22:59:23.460598 | TASK [Set cloud fact (local deployment)] 2025-09-23 22:59:23.484770 | orchestrator | skipping: Conditional result was False 2025-09-23 22:59:23.492706 | 2025-09-23 22:59:23.493425 | TASK [Clean the cloud environment] 2025-09-23 22:59:24.625908 | orchestrator | 2025-09-23 22:59:24 - clean up servers 2025-09-23 22:59:25.082611 | orchestrator | 2025-09-23 22:59:25 - clean up keypairs 2025-09-23 22:59:25.100631 | orchestrator | 2025-09-23 22:59:25 - wait for servers to be gone 2025-09-23 22:59:25.143494 | orchestrator | 2025-09-23 22:59:25 - clean up ports 2025-09-23 22:59:25.212468 | orchestrator | 2025-09-23 22:59:25 - clean up volumes 2025-09-23 22:59:25.271853 | orchestrator | 2025-09-23 22:59:25 - disconnect routers 2025-09-23 22:59:25.302905 | orchestrator | 2025-09-23 22:59:25 - clean up subnets 2025-09-23 22:59:25.324206 | orchestrator | 2025-09-23 22:59:25 - clean up networks 2025-09-23 22:59:25.489657 | orchestrator | 2025-09-23 22:59:25 - clean up security groups 2025-09-23 22:59:25.525695 | orchestrator | 2025-09-23 22:59:25 - clean up floating ips 2025-09-23 22:59:25.549802 | orchestrator | 2025-09-23 22:59:25 - clean up routers 2025-09-23 22:59:26.034948 | orchestrator | ok: Runtime: 0:00:01.299858 2025-09-23 22:59:26.038757 | 2025-09-23 22:59:26.038927 | PLAY RECAP 2025-09-23 22:59:26.039032 | orchestrator | ok: 2 changed: 1 unreachable: 0 failed: 0 skipped: 1 rescued: 0 ignored: 0 2025-09-23 22:59:26.039085 | 2025-09-23 22:59:26.163876 | POST-RUN END RESULT_NORMAL: [untrusted : 
github.com/osism/testbed/playbooks/cleanup.yml@main] 2025-09-23 22:59:26.164869 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main] 2025-09-23 22:59:26.882115 | 2025-09-23 22:59:26.882273 | PLAY [Base post-fetch] 2025-09-23 22:59:26.897561 | 2025-09-23 22:59:26.897708 | TASK [fetch-output : Set log path for multiple nodes] 2025-09-23 22:59:26.953908 | orchestrator | skipping: Conditional result was False 2025-09-23 22:59:26.971071 | 2025-09-23 22:59:26.971386 | TASK [fetch-output : Set log path for single node] 2025-09-23 22:59:27.011793 | orchestrator | ok 2025-09-23 22:59:27.020920 | 2025-09-23 22:59:27.021043 | LOOP [fetch-output : Ensure local output dirs] 2025-09-23 22:59:27.497256 | orchestrator -> localhost | ok: "/var/lib/zuul/builds/05e476f00ce44eecbe8a69c52322685d/work/logs" 2025-09-23 22:59:27.779689 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/05e476f00ce44eecbe8a69c52322685d/work/artifacts" 2025-09-23 22:59:28.050153 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/05e476f00ce44eecbe8a69c52322685d/work/docs" 2025-09-23 22:59:28.066346 | 2025-09-23 22:59:28.066473 | LOOP [fetch-output : Collect logs, artifacts and docs] 2025-09-23 22:59:29.022960 | orchestrator | changed: .d..t...... ./ 2025-09-23 22:59:29.023444 | orchestrator | changed: All items complete 2025-09-23 22:59:29.023577 | 2025-09-23 22:59:29.743623 | orchestrator | changed: .d..t...... ./ 2025-09-23 22:59:30.451338 | orchestrator | changed: .d..t...... 
./ 2025-09-23 22:59:30.482986 | 2025-09-23 22:59:30.483119 | LOOP [merge-output-to-logs : Move artifacts and docs to logs dir] 2025-09-23 22:59:30.518785 | orchestrator | skipping: Conditional result was False 2025-09-23 22:59:30.521547 | orchestrator | skipping: Conditional result was False 2025-09-23 22:59:30.546333 | 2025-09-23 22:59:30.546437 | PLAY RECAP 2025-09-23 22:59:30.546537 | orchestrator | ok: 3 changed: 2 unreachable: 0 failed: 0 skipped: 2 rescued: 0 ignored: 0 2025-09-23 22:59:30.546580 | 2025-09-23 22:59:30.675525 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main] 2025-09-23 22:59:30.677803 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main] 2025-09-23 22:59:31.394104 | 2025-09-23 22:59:31.394253 | PLAY [Base post] 2025-09-23 22:59:31.408535 | 2025-09-23 22:59:31.408670 | TASK [remove-build-sshkey : Remove the build SSH key from all nodes] 2025-09-23 22:59:32.389477 | orchestrator | changed 2025-09-23 22:59:32.398504 | 2025-09-23 22:59:32.398617 | PLAY RECAP 2025-09-23 22:59:32.398688 | orchestrator | ok: 1 changed: 1 unreachable: 0 failed: 0 skipped: 0 rescued: 0 ignored: 0 2025-09-23 22:59:32.398757 | 2025-09-23 22:59:32.516767 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main] 2025-09-23 22:59:32.519126 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-logs.yaml@main] 2025-09-23 22:59:33.306177 | 2025-09-23 22:59:33.306341 | PLAY [Base post-logs] 2025-09-23 22:59:33.316559 | 2025-09-23 22:59:33.316682 | TASK [generate-zuul-manifest : Generate Zuul manifest] 2025-09-23 22:59:33.770275 | localhost | changed 2025-09-23 22:59:33.787500 | 2025-09-23 22:59:33.787690 | TASK [generate-zuul-manifest : Return Zuul manifest URL to Zuul] 2025-09-23 22:59:33.826133 | localhost | ok 2025-09-23 22:59:33.833029 | 2025-09-23 22:59:33.833212 | TASK [Set zuul-log-path fact] 2025-09-23 
22:59:33.850911 | localhost | ok 2025-09-23 22:59:33.864231 | 2025-09-23 22:59:33.864396 | TASK [set-zuul-log-path-fact : Set log path for a build] 2025-09-23 22:59:33.902037 | localhost | ok 2025-09-23 22:59:33.908794 | 2025-09-23 22:59:33.908973 | TASK [upload-logs : Create log directories] 2025-09-23 22:59:34.413468 | localhost | changed 2025-09-23 22:59:34.416397 | 2025-09-23 22:59:34.416524 | TASK [upload-logs : Ensure logs are readable before uploading] 2025-09-23 22:59:34.917983 | localhost -> localhost | ok: Runtime: 0:00:00.005490 2025-09-23 22:59:34.927341 | 2025-09-23 22:59:34.927548 | TASK [upload-logs : Upload logs to log server] 2025-09-23 22:59:35.475579 | localhost | Output suppressed because no_log was given 2025-09-23 22:59:35.479632 | 2025-09-23 22:59:35.479851 | LOOP [upload-logs : Compress console log and json output] 2025-09-23 22:59:35.537990 | localhost | skipping: Conditional result was False 2025-09-23 22:59:35.543385 | localhost | skipping: Conditional result was False 2025-09-23 22:59:35.555608 | 2025-09-23 22:59:35.555840 | LOOP [upload-logs : Upload compressed console log and json output] 2025-09-23 22:59:35.603235 | localhost | skipping: Conditional result was False 2025-09-23 22:59:35.603859 | 2025-09-23 22:59:35.607467 | localhost | skipping: Conditional result was False 2025-09-23 22:59:35.621036 | 2025-09-23 22:59:35.621296 | LOOP [upload-logs : Upload console log and json output]